Federal Authorities Investigate Tesla Autopilot After Fatal Florida Car Accident

When a 40-year-old Ohio man crashed into a semi-truck on a Florida highway last May, the initial assumption was that this was simply another tragic accident. 

But as investigators would later come to find out, the Tesla (TSLA.O) Model S sedan in the fatal crash was running on autopilot at the time of the collision. While troopers with the Florida Highway Patrol continue their investigation, authorities with the National Highway Traffic Safety Administration (NHTSA) have launched their own inquiry.

The NHTSA has sent Tesla a nine-page letter requesting answers to a range of questions regarding the crash and the features that were reportedly supposed to be engaged at the time but appear to have failed. Specifically, the automatic emergency braking system and forward collision warning system do not appear to have worked as intended. Tesla, for its part, has insisted that its vehicles are safe when used as intended. One unnamed Tesla executive quoted by The New York Times said that while the autopilot feature can operate a car on its own for up to three minutes on the highway, drivers have to be ready to take control at a moment’s notice.

Of course, that raises the question: Are drivers getting mixed signals? If a car is supposed to be self-driving, to what extent can drivers realistically disengage? And if they have to be ready to take the wheel at any second, is it dangerous to allow them the freedom to only casually watch the road ahead?

Florida investigators in this case found a laptop in the vehicle. The car was equipped with a computer stand, but the laptop wasn’t mounted to it, and it was not powered on when investigators found it. They don’t know whether the 40-year-old decedent may have been using it at the time of the crash. What is known is that the driver was a huge fan of his Tesla. In fact, he made numerous YouTube videos in the vehicle, touting how great it was to be able to safely make videos while behind the wheel. But was it really all that safe? Clearly, something went wrong. There is no evidence that either the driver or the autopilot system applied the brakes at any point prior to the north Florida car accident.

Still, authorities say it could be weeks or months before they reach any conclusions as to the exact cause of the crash.

The autopilot system in this particular vehicle – and there are about 70,000 Teslas on the road with this same feature – is supposed to keep the vehicle in its lane, maintain a set speed and operate for a limited time without driver steering.

Tesla released a statement stressing that the autopilot system is not intended to render the vehicle a “self-driving” mode of transportation. The NHTSA’s investigation also touches on how the vehicle was marketed. There has been significant criticism of the fact that Tesla introduced the autopilot feature while it was still in “beta” mode, which typically means the manufacturer is still developing it. Some engineers have also argued that any system that relies on a driver suddenly resuming control of the vehicle is never going to be totally safe.

Call Freeman Injury Law — 1-800-561-7777 for a free appointment to discuss your rights. Now serving Orlando, West Palm Beach, Port St. Lucie and Fort Lauderdale.

Additional Resources:

Laptop in wreckage of Tesla Autopilot car: Florida investigators, July 7, 2016, by Paul Lienert, Reuters

More Blog Entries:

Orlando Car Accident Death Lawsuit Verdict for $10M Against Domino’s, July 3, 2016, West Palm Beach Car Accident Lawyer Blog
