Researchers wanted to know whether self-driving cars could be fooled and how hard it would be. For those of you who don't want to read the rest of the article: the answer to the first question is yes, and the answer to the second is not very hard.
Here are the test signs. The graphic comes from the Bleeping Computer article linked at the end of this post.
The second and third signs were also misread as speed limit signs, but only 67% of the time.
The fourth sign fooled the software 100% of the time; it thought it was a stop sign instead of a right-turn sign.
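The researchers used physical stickers on real signs, but the underlying idea — a small, targeted tweak to an image that flips a classifier's decision — can be sketched against a toy linear model. Everything below (the weights, the numbers, the fast-gradient-sign step) is illustrative, not the researchers' actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "sign classifier": a fixed logistic model over 100 features.
w = rng.normal(size=100)

def predict(x):
    """Probability the model assigns to class 1 (say, 'speed limit')."""
    return 1.0 / (1.0 + np.exp(-(x @ w)))

# An input the model confidently classifies as class 0 (say, 'stop sign').
x = -0.5 * w / np.linalg.norm(w)
print(f"clean score:     {predict(x):.3f}")    # well below 0.5

# Fast-gradient-sign-style step: nudge every feature slightly in the
# direction that raises the class-1 score. For a linear model, that
# direction is simply sign(w).
eps = 0.1
x_adv = x + eps * np.sign(w)
print(f"perturbed score: {predict(x_adv):.3f}")  # crosses 0.5
```

The point of the sketch is that the perturbation is tiny per feature (here 0.1), yet the decision flips — the same intuition behind a few well-placed stickers changing what a real classifier sees.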
The researchers say that you can solve the problem by not having any graffiti. Sure, that is simple – NOT!
Of course, since hackers are aware of this, they will likely cause mischief – just to see if they can get the cars to do things they are not supposed to do.
Right now it is not a big problem since there are so few self-driving cars, and those, for the most part, are only semi-self-driving.
However, as the software improves, hopefully it will not be fooled so easily – we shall see. Now that manufacturers and regulators are aware of the problem, they should be testing to see how easily cars can be fooled.
And finally, now that you know, I would suggest that reading a book while your car is driving you around is probably NOT a good plan. Just sayin’.
Information for this post came from Bleeping Computer.