Tesla’s safety statistics are misleading. The company and its CEO, Elon Musk, deploy them to claim their self-driving vehicles are safer than human drivers. The reality is far less simple: we do not know how safe Tesla’s self-driving software is, because Tesla refuses to tell us.

At its core, the problem is that Tesla’s crash statistics compare two incompatible measures: the national average of all crashes and the much narrower set of crashes Tesla records for its own vehicles. The figures in Tesla’s reports are ostensibly reassuring, showing that Tesla vehicles record far fewer airbag deployments than the average driver, and that Tesla drivers record far fewer accidents per million miles than the US average.

However, Tesla counts a crash only when an airbag deploys, capturing just the serious crashes, whereas the comparative figure it uses for the US average counts all crashes, a far greater number.

Tesla compares apples with oranges, and it uses this deception to mislead consumers and regulators into believing its vehicles are far safer than they really are. Tesla does not provide the real statistics on the safety of its self-driving software, likely because they would contradict Tesla’s propaganda.

We therefore do not know how safe Tesla vehicles are because Tesla will not tell us. 

In 2018, Tesla took the ostensibly transparent step of publishing quarterly safety data for its vehicles in order to demonstrate their safety. More recently, it has also released an annual “Impact Report”, which includes data on drivers who use its Full Self-Driving (Beta) software.

These data releases have given a veneer of credibility to the claim that self-driving Teslas are safer than the national average, and they are frequently cited by those who promote such claims.

What Tesla did not realise, however, is that by making its safety data publicly available, it showed the world how it overstates the safety of its software and the methods it uses to obfuscate the true safety of its vehicles. These figures ostensibly show that Tesla drivers record far fewer accidents per million miles than the US average:

[Chart: Tesla Vehicle Safety Report, miles driven before an accident, Tesla vehicles vs. the US average]

Tesla’s statistics give the impression that the safety features in its self-driving software increase the average number of miles driven before an accident by up to nine times compared with the national average, as of the fourth quarter of 2022.

However, scrutinising Tesla’s methodology, listed in the fine print buried at the bottom of the Safety Report webpage, tells a different story. An update posted in January 2023 reveals the following:

“As part of Tesla’s commitment to continuous improvement, recent analysis led us to identify and implement upgrades to our data reporting. Specifically, we discovered reports of certain events where no airbag or other active restraint deployed, single events that were counted more than once, and reports of invalid or duplicated mileage records. Including these events is inconsistent with our methodology for the Vehicle Safety Report and they will be excluded going forward.”

So, Tesla only records an incident as a crash if the airbags were deployed, and it has discounted all accidents “where no airbag or other active restraint deployed”.
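To see how far this definitional sleight of hand can skew a comparison, consider a minimal sketch in Python. The numbers below are purely illustrative assumptions, not Tesla’s actual data; the point is only the arithmetic.

```python
# Illustrative sketch: how the crash definition skews "miles per crash".
# All figures are hypothetical, chosen only to demonstrate the arithmetic.

fleet_miles = 1_000_000_000    # total miles driven by a hypothetical fleet
all_crashes = 2_000            # every crash, including minor fender-benders
airbag_crashes = 350           # the subset in which an airbag deployed

miles_per_crash_all = fleet_miles / all_crashes         # 500,000 miles/crash
miles_per_crash_airbag = fleet_miles / airbag_crashes   # ~2,857,000 miles/crash

# Dividing one fleet's airbag-only figure by another fleet's all-crash
# figure manufactures an apparent safety multiple out of thin air:
apparent_multiple = miles_per_crash_airbag / miles_per_crash_all
print(f"Apparent safety multiple: {apparent_multiple:.1f}x")  # ~5.7x
```

The driving in this sketch is identical in both calculations; only the definition of “crash” changed, yet the narrower definition alone conjures a near-sixfold “safety” advantage.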

This pattern of obscuring the truth about the safety of Tesla’s vehicles and self-driving software is unfortunately widespread, leaving consumers and regulators guessing at the true safety of Tesla’s fleet.

We don’t know the true figures. Tesla knows them but refuses to disclose them, likely because they contradict Elon Musk’s frequent claims that Tesla’s self-driving software is far safer than a human driver.

How is the public supposed to trust Musk’s recent claim, made at the New York Times DealBook Summit, that “supervised Full Self-Driving is somewhere around four times safer than a human driving by themselves”?

At Tesla’s Investor Day in March 2023, Director of Autopilot Software Ashok Elluswamy told investors that Tesla drivers using Full Self-Driving (Beta) averaged 3.2 million miles between collisions, claiming that Tesla Full Self-Driving is six times less likely than the US average to have a crash in which airbags are deployed.

The slide presented at Tesla’s Investor Day in March 2023 shows that Tesla is misleading the public and investors by comparing apples with oranges. The quoted figure of half a million miles per collision for the US average corresponds to all crashes, regardless of whether an airbag deployed. The corresponding figure for drivers using FSD Beta follows Tesla’s own definition of a crash, which encompasses only incidents in which airbags were deployed, in accordance with the small print in its safety reports.

Elluswamy’s safety claims conceal the fact that Tesla’s self-driving software is far more dangerous than the average human driver. If Tesla had compared apples to apples, it would have exposed the critical safety defects in its self-driving software.
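For a sense of what an apples-to-apples comparison might look like, here is a hedged back-of-the-envelope sketch. The share of US crashes that involve an airbag deployment is not published as a single authoritative figure, so the AIRBAG_FRACTION value below is a pure assumption for illustration:

```python
# Back-of-the-envelope check of the Investor Day comparison.
# The two mileage figures are the ones quoted on Tesla's slide;
# AIRBAG_FRACTION is a hypothetical assumption, NOT an official statistic.

us_miles_per_any_crash = 500_000          # quoted US average, all crashes
tesla_miles_per_airbag_crash = 3_200_000  # Tesla FSD Beta, airbag crashes only

AIRBAG_FRACTION = 0.18  # assumed share of US crashes that deploy an airbag

# If only that fraction of US crashes deploy airbags, the like-for-like
# US figure is far larger than the all-crash number Tesla compared against:
us_miles_per_airbag_crash = us_miles_per_any_crash / AIRBAG_FRACTION  # ~2.8M

ratio = tesla_miles_per_airbag_crash / us_miles_per_airbag_crash
print(f"Like-for-like ratio: {ratio:.2f}x")  # ~1.15x under this assumption
```

Under that hypothetical assumption, the claimed sixfold advantage collapses to roughly parity. The exact figure is unknowable without Tesla’s raw data; the point is that the comparison is meaningless unless both sides count crashes the same way.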

The Tesla Full Self-Driving program continues to be built upon a foundation of lies. This should come as no surprise given Elluswamy’s testimony in a June 2022 deposition, in which he admitted that Tesla’s 2016 demonstration of Full Self-Driving’s capabilities was not a true reflection of what the software could achieve: “the intent of the video was not to accurately portray what was available for customers in 2016, it was to portray what was possible.”

Elluswamy also admitted that when Tesla was filming the video, Autopilot crashed “into a fence in our parking lot.”

It comes as no surprise therefore, that the same team that staged the demonstration of Full Self-Driving’s real-world capabilities in 2016 continues to mislead the public about safety to this day.

Another touchstone Tesla supporters frequently point to is the European New Car Assessment Programme’s (Euro NCAP) testing of Tesla vehicles, which is often used to claim that Tesla vehicles are far safer than other manufacturers’ and that Tesla’s active safety systems protect those around them.

Elon Musk himself has relied on Euro NCAP’s testing videos, in which the agency gave the 2022 Tesla Model Y a five-star rating, to claim that Tesla is world-leading in pedestrian safety.

More recently, Tesla’s Rohan Patel used Euro NCAP’s testing to support his claim that Tesla vehicles have “best-in-class performance” for safety.

However, Euro NCAP’s tests in fact support a very different conclusion from the one Tesla supporters promulgate online: Tesla’s driver assistance features are not as safe as Tesla claims.

The video circulated online of the European agency’s tests in fact relates to the testing of a 2022 Tesla Model Y, which assessed the vehicle’s standard safety features rather than Tesla’s Advanced Driver Assistance Systems, such as Autopilot or Full Self-Driving. Euro NCAP considered Tesla’s standard safety features impressive and consequently gave the Model Y a five-star rating.

However, the same agency also tests manufacturers’ driver assistance systems, and it drew a very different conclusion about the safety of Tesla’s Autopilot. Euro NCAP’s Assisted Driving Gradings in 2020 placed Tesla’s Autopilot system sixth out of the ten manufacturers tested, a far weaker showing than the standard Model Y’s crash-test result.

Notably, Euro NCAP has not tested Tesla’s Autopilot system since 2020, perhaps due to Tesla’s poor performance in that year’s rankings.

Euro NCAP’s testing of Tesla’s Autopilot found that it performed particularly poorly on Driver Monitoring, scoring 10 of a possible 25 points, and on Driving Collaboration, scoring 0 of 25, reflecting the fact that Tesla’s self-driving software does not function as a driver assistance system: it disengages when the driver performs any active driving task, such as steering or braking.

The report produced by Euro NCAP also noted that “Tesla’s system name Autopilot is inappropriate as it suggests full automation.”

Euro NCAP gave Tesla Autopilot an overall grade of ‘Moderate’, the second-lowest possible. Only Renault and Peugeot received the ‘Entry’ ranking, the only lower classification.

Tesla ranked behind the ‘Assisted Driving’ systems of Audi, BMW, Mercedes, Ford, and Nissan.

Tesla’s poor performance compared with Audi’s Adaptive Cruise Assist is made clear by the agency’s overall assessment of both systems:

Category (max points)                        Audi Q8   Tesla Model 3
Consumer Information (25)                    20        10
System Status (25)                           25        16.5
Driver Monitoring (25)                       10        10
Driving Collaboration (25)                   23        0*
Speed Assistance (25)                        24.9      16.7
Adaptive Cruise Control Performance (40)     29.1      40
Steering Assistance (35)                     30        35
System Failure (25)                          25        25
Unresponsive Driver Intervention (25)        20        20
Collision Avoidance (50)                     39.3      50


The overwhelming conclusion makes uncomfortable viewing for those who wish to leverage Euro NCAP’s testing in Tesla’s favour. Contrary to the prevailing view among Tesla supporters, Euro NCAP’s testing shows that when Tesla’s ADAS software is activated, the overall safety of the vehicle is impaired relative to a Tesla operating without those features.

The public deserves transparency when it comes to the safety of Tesla’s self-driving software, and The Dawn Project supports Euro NCAP’s independent tests of Tesla’s self-driving vehicles. They present a clear picture that Tesla’s self-driving software has critical limitations and is not as safe as other manufacturers’ systems, contrary to Elon Musk’s endless claims.

Anyone who doesn’t own Tesla stock can see that Full Self-Driving, with a human driver correcting its worst errors, is worse than a drunk driver. In fact, drivers have reported thousands of critical safety defects in Tesla’s Full Self-Driving software.

Despite this, Tesla fanboys continue to claim that Full Self-Driving, supervised by an alert driver, is safer than a human driver alone, a claim seemingly corroborated by Tesla’s own safety statistics.

Tesla’s fans argue that, as a driver assistance system, the combination of Full Self-Driving and an alert human driver surpasses the safety of a human driver alone. However, Tesla Full Self-Driving does not function as a driver assistance system; rather, it takes control away from the human driver. Full Self-Driving cannot assist the driver in any way, so it is not an Advanced Driver Assistance System. Musk’s own words support this: he recently responded to Andrej Karpathy, former head of Autopilot Vision at Tesla, that most people on earth are not aware that “Tesla FSD” vehicles “can drive themselves”.

Tesla’s self-driving project relies on Tesla’s “statistics” claiming FSD is safer than the average driver, which fanboys invoke to deflect valid criticism of FSD.

However, these safety statistics were fabricated in the same propaganda factory that:

  • Ordered engineers to create the fake FSD demonstration video.

In October 2016, Elon Musk ordered his self-driving team to develop a slick demo video of a self-driving car driving a typical employee commute to Tesla headquarters. In an email, Musk said, “Just want to be absolutely clear that everyone’s top priority is achieving an amazing Autopilot demo drive”, adding, “Since this is a demo, it is fine to hardcode some of it, since we will backfill with production code later.” After viewing a version of the video, Musk said that there were still too many jump cuts and that the demo footage “needs to feel like one continuous take.”

Musk told his team, “I will be telling the world that this is what the car *will* be able to do, not that it can do this upon receipt.”

Elon Musk had a clear vision for the video. According to internal emails, when the video was finally up to Elon Musk’s standards, he dictated that the following opening text be added to the video:

The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.

https://www.youtube.com/watch?v=ivTeW4xWQv0

In October 2016, Elon tweeted that the video showed a Tesla “driving itself” with “no human input at all”, linking to the video of the journey on the Tesla website, which has now been delisted:

https://twitter.com/elonmusk/status/789019145853513729

The video is not one continuous take (it only looks that way).  It was assembled from short clips made during 3 days of shooting over 500 miles of driving (avoiding heavy traffic). In that time there were 182 disengagements when the car told the driver to take over or the driver took over to prevent an accident!  

When asked if the Tesla drove up over a curb, through bushes and hit a fence, Elluswamy testified: “I’m not sure about the curb or the bush, I do know about the fence.”  The film of that ended up on the cutting room floor.

Tesla and Musk did not disclose when releasing the video that engineers had created a three-dimensional map of the route the Model X took, Elluswamy said during his deposition. Years after the demo, Musk said the company doesn’t rely on high-definition maps for automated driving systems, arguing that systems that do are less able to adapt to their surroundings. Yet this drive depended on mapping technology that has never shipped in any Tesla production car.

One YouTuber, former Tesla employee AI Addict, recently recreated the route from the 2016 “Paint It Black” demo video to determine whether FSD Beta could handle the same route over six years later. The recreation showed that the Tesla still couldn’t handle the basic route it was purported to have driven all those years ago, even blowing past a yield sign after just two minutes, causing the driver to exclaim “Oh God, f*ck!”

The Dawn Project also recently recreated the route used in the “Paint It Black” video and found that Tesla’s self-driving software still cannot complete the simple route, even attempting to pull out into oncoming traffic.

Tesla’s self-driving project has rested on a foundation of propaganda from its outset, despite Elon’s public claims at the time that the video was an accurate reflection of the software’s capabilities.

  • Ordered employees to lie to customers about range.

In July 2023, a report revealed that Tesla employees were under instructions to thwart any customers who complained that their vehicles could not achieve the full distance indicated by the in-car range estimate.

It was also reported that Tesla had established a “Diversion Team” to deny service appointments to customers complaining about their vehicles’ range. At Tesla’s Nevada base, the report uncovered concerning examples of employees celebrating each cancelled range-complaint appointment by striking a xylophone, provoking applause from other Tesla workers, who were reportedly ranked internally according to how many service appointments they cancelled.

  • Ordered customer support to not record customer complaints about unintended acceleration and phantom braking.

Another recent landmark investigation revealed that Tesla’s track record of impeding legitimate complaints about its products extends beyond exaggerated driving range to cases of unintended acceleration and phantom braking.

Instances of unintended acceleration have been widely reported, and the Office of Defects Investigation (ODI) recently received a petition to re-evaluate its decision not to open a defect investigation into unintended acceleration in Tesla vehicles. The petition referred to open-source analysis of the Tesla Model 3 inverter design, which showed that “negative spikes in Tesla’s low-voltage system can be interpreted as a full acceleration command even though the driver did not touch the accelerator”. In February 2022, the National Highway Traffic Safety Administration opened an investigation into complaints of “phantom braking” after a spike in such reports.

Files retrieved as part of the ‘Tesla Files’ by German publication Handelsblatt documented that Tesla received more than 2,400 complaints alleging instances of unintended acceleration in its vehicles. Evidence from internal sources also revealed that Tesla customer service employees were instructed to handle such complaints verbally, without committing details of consumers’ reports of cases of unintended acceleration and phantom braking to writing. Information obtained by Handelsblatt showed that, when dealing with such complaints, employees were instructed: “Do not copy the report below into an email, text message or leave it in a voicemail to the customer”.

  • Promised that a Model 3 bought in 2019 would not only not depreciate but would increase in value to a couple hundred thousand dollars.

In April 2019, Elon Musk publicly declared on Twitter that, due to the increasing underlying value of its Full Self-Driving package, the value of Tesla’s vehicles would appreciate over time.

He later argued that, in the future, vehicles purchased in 2019 would be “worth $100k to $200k”, with the Model 3 valued at approximately “$75k”.

Since Musk’s statements, the price of Tesla’s Full Self-Driving package has increased to $15,000, while the Model 3 has been subject to a series of price cuts and currently starts from $40,240.

  • Said in 2016 that a Model S and Model X can drive autonomously safer than a human.

In 2016, Elon Musk told an audience that a Tesla Model S or Model X could “at this point, drive autonomously with greater safety than a person”. Almost eight years later, Tesla still warns that its self-driving software “may do the wrong thing at the worst time”, a warning borne out by at least 27 fatalities and 1,002 accidents attributed to its ADAS technology in NHTSA’s SGO data.

  • Promised that a Model 3 would operate for a million miles with minimal maintenance, at an all-in cost of one-third the per-mile cost of a comparable gasoline-powered car.

In April 2019, Musk told investors that Tesla were developing a new battery pack which would be able to last one million miles, and that “the cars currently being built are all designed for a million miles of operation. The drive unit is designed, tested, and validated for 1 million miles of operation.”

  • Stated that Tesla would pay every Model 3 owner $30,000 per year of passive income if they let Tesla use their Model 3 as a robotaxi when they weren’t using it.

During an announcement at Autonomy Day in April 2019, Elon even claimed that Tesla’s robotaxi fleet would generate income for Tesla owners who rented out their vehicles via a ride-hailing app, on the same model as Uber and Airbnb, earning them up to $30,000 a year. To this day, no revenue has ever been generated from Tesla robotaxis.

  • Repeatedly overstates the progress of the development of products that remain vaporware to this day.

A cursory Google search of “Musk unveils” and “Tesla reveals” returns a string of headlines over the years announcing various supposed achievements. The problem with these and other such “unveilings” is that the products remain vaporware. And while the articles themselves acknowledge this, collectively they perpetuate the narrative of Elon as a manufacturing visionary. The next-generation Roadster, for example, has been anticipated by customers for years and has yet to be delivered.

Another prime example of Elon misleading investors and the public is his 2016 claim that he was launching the Boring Company to “build a tunnel” to avoid traffic, in what became another of his failed enterprises.

“It shall be called The Boring Company,” he added. “Boring, it’s what we do.”

In 2017, he tweeted that the plan was to “start digging in a month or so” as part of a tunnel system in Los Angeles, even though it emerged at the same time that he had failed to obtain permission from city planning officials to do so.

  • Has made numerous baseless claims about landing on Mars.

Elon’s baseless claims about settling on Mars have a long history. Musk has been discussing his plans since 2004, according to a contemporary article in The Guardian. In 2011 he claimed he would put “a man on Mars in 10 years”, which he revised a year later to a “12 to 15” year timeframe.

Musk’s timeline has continued to shift: almost a decade later, in November 2020, he said it would take “10 or 20 years” to settle on Mars, up to a twenty-year discrepancy from his first estimate in 2011. The estimate shifted yet again just a month later, when he told an audience in Berlin that he was “highly confident” that SpaceX would be able to land humans on Mars “about six years from now”.

  • For each of the last eight years stated that Tesla would achieve full autonomous driving within a year.

In fact, Elon has been promising the delivery of robotaxis “next year” for the past eight years. They have still not been delivered and are not even close to being ready.

Elon’s lies and manipulations of the truth when it comes to Full Self-Driving are well documented. In 2019 he claimed that Tesla’s self-driving software would be able to “drive around a parking lot, find an empty spot” as well as “read signs”. He even claimed in 2016 that within two years a Tesla would be able to drive from Los Angeles to New York without a driver.

At the company’s Autonomy Day in 2019, Musk claimed that by 2020, Tesla would have one million robotaxis on the roads. Investors, reporters and enthusiasts alike were enthralled by the idea that Tesla would soon deploy a fleet of driverless robotaxis to revolutionize transit. Musk boasted during the event that within two years, Tesla would be making vehicles with no pedals or steering wheels.

Casting aside the mammoth task of achieving full autonomy, Musk quipped that “all you need to do is improve the software”, stating that Tesla’s hardware was good enough to provide the base on which Tesla would build a fully autonomous car. This April will mark five years since Elon promised Tesla’s investors Level 5 autonomy. As of today, Tesla has not achieved any of these lofty goals. There are zero Tesla robotaxis, and Full Self-Driving is still riddled with errors and fails in the most basic driving situations.

The Dawn Project has documented the critical safety defects in Tesla’s self-driving software, demonstrating how it fails in basic driving situations.

The Dawn Project recently put Tesla’s self-driving software through a California DMV driving test in Santa Barbara, CA, in which the Tesla failed the test on four separate points, leading to an overall fail for Tesla’s self-driving software. If Tesla’s self-driving cars cannot even pass a standard DMV driving test, how can they possibly be four times safer than a human driver, as Elon Musk has claimed?

Even Tesla Full Self-Driving’s largest and most vocal advocate, Omar Qazi, demonstrated how unsafe Tesla’s system is in a recent viral video. No student driver would drive as badly as Tesla’s self-driving system did in Omar’s video. During the short ride in San Francisco, Omar’s Tesla ran two stop signs and swerved into parked vehicles and oncoming traffic, displaying in plain sight the dangers of this defective software.

Conclusion

We do not know how safe Tesla vehicles are because Tesla will not tell us. 

Tesla obfuscates the true safety of its self-driving vehicles by providing consumers, investors and regulators with deceptive statistics, irrelevant and outdated testing information, and false promises about the future safety of its vehicles. In the meantime, at least 27 people have died in Tesla’s self-driving experiment, with over 1,000 confirmed crashes.

While we don’t know the true rate of crashes involving Tesla’s self-driving software compared with the national average, we do know that Tesla’s statistics are invalid, and therefore irrelevant. Most drivers have never had an airbag deployment, and most collisions are minor fender-benders. A like-for-like national average, counting crashes the same way on both sides, would likely cast Tesla’s safety record in a highly negative light.

What we do know is that Tesla self-driving drives like a drunk and makes critical driving errors which endanger the lives of all road users.