In 2018, Tesla took the ostensibly transparent step of publishing quarterly safety data for its vehicles in order to demonstrate their safety. More recently, it has also released an annual “Impact Report”, which includes data on drivers who use its Full Self-Driving (Beta) software.
The data made available by Tesla have given a veneer of credibility to the claim that self-driving Teslas are safer than the national average, and are constantly cited by those who promote that claim.
What Tesla did not realise, however, is that by making their safety data publicly available, they were showing the world how they overstate the safety of their software. The data in fact show that Tesla’s self-driving software is up to five times more dangerous than a human driver.
The figures in these reports are ostensibly reassuring, showing that Tesla vehicles record far fewer airbag deployments than the average vehicle, and that Tesla drivers record far fewer accidents per million miles than the US average:
Tesla’s statistics give the impression that the safety features included in their self-driving software can increase the average number of miles driven before an accident by up to nine times, compared with the national average, in the fourth quarter of 2022.
However, Tesla’s methodology, listed in the fine print buried at the bottom of the Safety Report webpage, rewards scrutiny. An update posted in January 2023 reveals the following:
As part of Tesla’s commitment to continuous improvement, recent analysis led us to identify and implement upgrades to our data reporting. Specifically, we discovered reports of certain events where no airbag or other active restraint deployed, single events that were counted more than once, and reports of invalid or duplicated mileage records. Including these events is inconsistent with our methodology for the Vehicle Safety Report and they will be excluded going forward.
So Tesla only records an incident as a crash if the airbags were deployed, and it has discounted all accidents “where no airbag or other active restraint deployed”.
According to the Society of Automotive Engineers, there have been an estimated 2.1 million airbag deployments in the United States over the past ten years, an average of 210,000 deployments each year. American drivers cover approximately 3.2 trillion miles each year.
This puts the US average at one airbag deployment every 15.2 million miles, not 0.5 million miles, as Tesla incorrectly asserts. In the fourth quarter of 2022, Tesla vehicles on Autopilot were therefore three times more likely to crash than vehicles from other manufacturers.
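As a rough sanity check, that baseline can be reproduced directly from the figures above. The following is a back-of-the-envelope sketch, not an official calculation; the deployment and mileage numbers are the approximations quoted in this article:

```python
# Back-of-the-envelope check of the US airbag-deployment baseline,
# using the approximate figures quoted above.
airbag_deployments_10yr = 2_100_000                   # SAE estimate for the past ten years
deployments_per_year = airbag_deployments_10yr / 10   # ~210,000 per year
us_miles_per_year = 3.2e12                            # ~3.2 trillion miles driven annually

miles_per_deployment = us_miles_per_year / deployments_per_year
print(f"One airbag deployment every {miles_per_deployment / 1e6:.1f} million miles")
# -> roughly 15.2 million miles, far above the 0.5 million mile baseline Tesla uses
```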
How does it compare for Tesla Full Self-Driving (Beta)?
When the US average mileage between airbag deployments is compared with Tesla’s figures for Full Self-Driving (Beta), the numbers are even worse for Tesla.
On average, an airbag deployment occurs in the US once every 15.2 million miles. According to Tesla, Full Self-Driving (Beta) averages only 3.2 million miles between collisions in which airbags are deployed, which means Tesla Full Self-Driving is roughly five times more likely to have a crash involving an airbag deployment than a human driving a car.
That figure comes from Tesla itself: at Tesla’s Investor Day in March 2023, Director of Autopilot Software Ashok Elluswamy told investors that Tesla drivers using Full Self-Driving (Beta) averaged 3.2 million miles between such collisions.
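The ratio behind the “five times” claim is simple to reproduce from these two figures (again a rough sketch using only the numbers quoted above):

```python
# Ratio of the derived US baseline to Tesla's reported FSD (Beta) figure.
us_miles_per_airbag_crash = 15.2e6   # derived above from the SAE and national mileage figures
fsd_miles_per_airbag_crash = 3.2e6   # Tesla Investor Day, March 2023

ratio = us_miles_per_airbag_crash / fsd_miles_per_airbag_crash
print(f"FSD (Beta) crashes with airbag deployment ~{ratio:.1f}x more often than the US average")
# -> ~4.8x, i.e. roughly five times more likely
```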
The above data perhaps explain why Tesla refuses to report its crash statistics relating to Full Self-Driving in accordance with any established reporting norms.
The above slide from Tesla’s Investor Day in March 2023 shows that Tesla is misleading the public and investors by comparing apples with oranges. The quoted figure of half a million miles per collision for the US average corresponds to all crashes, regardless of whether an airbag deployed. The figure for crashes recorded by drivers using FSD Beta, by contrast, follows Tesla’s own definition of a crash, which encompasses only incidents in which airbags were deployed, in accordance with the small print in its safety reports.
Ashok’s safety claims were deliberately framed to obscure the fact that Tesla’s self-driving software is five times more dangerous than the average human driver. Had Tesla compared apples to apples, the comparison would have exposed the critical safety defects in Tesla Full Self-Driving (Beta).
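To make the apples-to-oranges problem concrete, the sketch below contrasts the two possible readings of the slide. The ~6.4x figure is simply the ratio of the two numbers Tesla put side by side, not a figure Tesla itself quotes, and the 15.2 million mile baseline is the estimate derived earlier in this article:

```python
# Two ways of reading the Investor Day slide, using the figures quoted above.
fsd_miles_per_airbag_crash = 3.2e6   # FSD (Beta): airbag-deployment crashes only (Tesla's definition)
us_miles_per_any_crash = 0.5e6       # US average: all crashes, airbag deployment or not
us_miles_per_airbag_crash = 15.2e6   # US average: airbag-deployment crashes (derived earlier)

apples_to_oranges = fsd_miles_per_airbag_crash / us_miles_per_any_crash
apples_to_apples = us_miles_per_airbag_crash / fsd_miles_per_airbag_crash
print(f"Mismatched baselines: FSD appears ~{apples_to_oranges:.1f}x safer than the US average")
print(f"Like-for-like:        FSD crashes ~{apples_to_apples:.1f}x more often than the US average")
# -> ~6.4x "safer" with mismatched definitions, ~4.8x worse when comparing like with like
```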
Tesla’s Full Self-Driving program continues to be built upon a foundation of lies. This should not come as a surprise given Elluswamy’s testimony in a June 2022 deposition, in which he admitted that Tesla’s 2016 demonstration of Full Self-Driving’s capabilities was not a true reflection of what the software could achieve, but rather that “the intent of the video was not to accurately portray what was available for customers in 2016, it was to portray what was possible.”
Elluswamy also admitted that when Tesla were filming the video, Autopilot crashed “into a fence in our parking lot.”
It comes as no surprise, therefore, that the same team that staged the 2016 demonstration of Full Self-Driving’s real-world capabilities continues to mislead the public about safety to this day.