Tesla and other self-driving car companies have been particularly tight-lipped about crashes, especially crashes that occur while their cars are in self-driving mode.
The National Highway Traffic Safety Administration (NHTSA) has issued a new rule that pulls the covers off that secrecy.
Now companies will have to report ALL crashes in which semi-autonomous driving, steering-assist, or automatic lane-keeping systems are involved. Not only does this affect Tesla, but it also affects Waymo, Zoox, Cruise and others.
The new rule says that any crash involving a semi-autonomous system and “a hospital-treated injury, a fatality, a vehicle tow-away, an air bag deployment, or a vulnerable road user such as a pedestrian or bicyclist” must be reported to NHTSA within one day of the company learning of the crash, with an updated report due 10 days later.
The companies also have to generate monthly reports and provide them to NHTSA.
To encourage compliance, companies that fail to report will face fines of $22,992 per day, up to a maximum of $100 million.
I assume that will get even Elon’s attention.
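For a sense of scale, here is some back-of-the-envelope math on those fines (the dollar figures come from the rule; everything else is just arithmetic):

```python
# Rough math on the NHTSA non-compliance fines described above.
daily_fine = 22_992          # dollars per day of non-compliance
cap = 100_000_000            # statutory maximum fine

days_to_cap = cap / daily_fine
print(f"About {days_to_cap:,.0f} days (~{days_to_cap / 365:.1f} years) to hit the cap")
```

In other words, a company would have to stonewall for over a decade before the daily fines max out, so the cap is less of a constraint than it might sound.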
The objective is to give federal regulators more data to understand how safe, or unsafe, some of this new tech really is.
In related news, researchers wanted to know whether self-driving cars could be fooled and, if so, how hard it would be. For those of you who don’t want to read the rest of the article: the answer to the first question is yes, and the answer to the second is not very hard.
Here are the test signs. The graphic comes from the Bleeping Computer article linked at the end of this post.
The first sign fooled the machine learning software into reading it as a “Speed Limit 45” sign 100% of the time.
The second and third signs were also misread as speed limit signs, but only 67% of the time.
The fourth sign fooled the software 100% of the time, which read it as a stop sign instead of a right-turn sign.
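To see why small graffiti-style changes can flip a classifier's answer, here is a toy sketch. This is NOT the researchers' actual attack or a real sign classifier; it is a made-up linear model on fake image data, perturbed in the fast-gradient style, just to illustrate the failure mode:

```python
# Toy illustration: a tiny linear "sign classifier" flipped by a small,
# targeted perturbation (a digital analogue of graffiti stickers).
# The model, weights, and image here are all invented for demonstration.
import numpy as np

rng = np.random.default_rng(0)

# A fake 8x8 grayscale "stop sign" image, flattened to 64 pixel values.
stop_sign = rng.random(64)

# A made-up linear classifier: score > 0 means "stop", else "speed limit 45".
weights = rng.standard_normal(64)
bias = -weights @ stop_sign + 0.5   # calibrated so the clean sign scores +0.5

def classify(img):
    return "stop" if weights @ img + bias > 0 else "speed limit 45"

print(classify(stop_sign))          # clean image is classified as "stop"

# Nudge every pixel by at most 0.02 in the direction that lowers the
# "stop" score -- a barely-visible change that flips the decision.
epsilon = 0.02
adversarial = stop_sign - epsilon * np.sign(weights)

print(classify(adversarial))        # perturbed image becomes "speed limit 45"
```

Real vision models are far more complex than this, but the principle is the same: many tiny pixel-level changes, each invisible on its own, can add up to push the model across a decision boundary.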
The researchers say you can solve the problem by not having any graffiti on signs. Sure, that is simple – NOT!
Of course, since hackers are aware of this, they will likely cause mischief – just to see if they can get cars to do things they are not supposed to do.
Right now it is not a big problem, since there are so few self-driving cars on the road and those, for the most part, are only semi-self-driving.
However, as the software improves, hopefully it will not be fooled so easily – we shall see. Now that manufacturers and regulators are aware of the problem, they should be testing to see how easily cars can be fooled.
And finally, now that you know, I would suggest that reading a book while your car is driving you around is probably NOT a good plan. Just sayin’.
Information for this post came from The Bleeping Computer.