
August 5, 2025 - Tesla Hid Crash Data and Lied About It, Regulatory Inaction Also Responsible

This is the Telemetry Transportation Daily for August 5, 2025 and I'm Sam Abuelsamid, Vice President of Market Research for Telemetry.


Following the conclusion of a trial in which Tesla was found partly liable for the death of a woman after Autopilot failed to prevent a crash, writer Fred Lambert from the website Electrek reviewed a copy of the trial transcripts. Lambert found a trove of very damning evidence about Tesla's actions following the crash, and it should serve as a warning to everyone who drives a connected vehicle. I'll include a link to Lambert's article in the show notes; it is worth reading.


Autopilot is an advanced driver assistance system that Tesla claims makes its vehicles safer than human drivers, and both the branding and Musk's repeated public comments over the past decade imply that it can drive the vehicle more safely than a human. The first known fatal crash involving Autopilot occurred in May 2016, when Joshua Brown's Model S drove under a tractor trailer, slicing off the roof and decapitating the driver. In the years since, the system has improved but still makes frequent errors, such as phantom braking and failing to see obstacles like stationary vehicles. In the report following the National Transportation Safety Board investigation of the Brown crash, the NTSB made numerous recommendations that neither the National Highway Traffic Safety Administration nor Tesla has acted upon, although other automakers have.


Among those recommendations were that use of such ADAS features should be limited to the operational design domain, or ODD, they were designed for, and that there should be more robust driver monitoring to avoid predictable misuse. More than 9 years after Brown's death, NHTSA has never taken action to require such protections. Tesla also refuses to geofence its systems and has extremely limited driver monitoring.
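For readers who want a concrete picture of what "limiting a feature to its ODD" means, here is a minimal illustrative sketch. It is not Tesla's implementation, and the function and field names are invented for the example; it simply shows how map attributes and driver monitoring could gate whether the feature is allowed to engage.

```python
# Illustrative sketch only: a hypothetical ODD "gate" an ADAS could apply
# before allowing the feature to engage. Names and fields are invented.

from dataclasses import dataclass

@dataclass
class RoadSegment:
    road_type: str           # e.g. "divided_highway", "rural_two_lane"
    has_cross_traffic: bool   # intersections with crossing traffic
    map_restricted: bool      # segment flagged as restricted in the map data

def autopilot_allowed(segment: RoadSegment, driver_attentive: bool) -> bool:
    """Allow engagement only inside the designed ODD and only when
    driver monitoring confirms the driver is attentive."""
    inside_odd = (
        segment.road_type == "divided_highway"
        and not segment.has_cross_traffic
        and not segment.map_restricted
    )
    return inside_odd and driver_attentive

# A rural road with cross traffic, flagged restricted in the map, fails the gate:
print(autopilot_allowed(
    RoadSegment(road_type="rural_two_lane", has_cross_traffic=True, map_restricted=True),
    driver_attentive=False,
))  # -> False
```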


In the moments after the 2019 crash that took the life of Naibel Benavides, the Tesla Model S uploaded a data file, including video and activity logs, to Tesla, a file whose existence the company denied and which it tried to hide from investigators. It took until 2024 for a forensic investigator to obtain evidence from the car's Autopilot computer that Tesla had tried to erase the data, and to recover the full file for examination. The file showed that Autopilot was active and made fundamental errors; had the system worked as it should have, the crash might have been prevented. Among the evidence was that the internal maps had marked the intersection as a restricted zone, which should have blocked use of Autopilot there. Tesla's refusal to prevent predictable driver misuse of the system could have been overridden as early as 2016 if NHTSA had acted on those NTSB findings.


Instead, a regulatory agency that was dysfunctional even before the first Trump administration, and has only gotten worse in the years since, has allowed countless people to die through its inaction. Even without a regulator, Tesla's management, starting with CEO Elon Musk, could have stopped this and chose not to, in order to perpetuate the narrative that Teslas are safer than other cars and can drive themselves, a narrative that has in turn inflated the company's stock price.


While it's unlikely that the current administration in Washington will do anything that will actually protect the American public, it's up to that public to hold Tesla and every other automaker to account by demanding access to all of the data they have. It's also up to consumers to speak with their wallets when companies like Tesla misbehave.


Thanks for listening.
