Tesla’s statistics on Full Self-Driving’s mileage show that customers are only using the software for about 14.6% of their driving. In Tesla’s latest Q4 2023 earnings call, Elon Musk said that 400,000 customers were using Full Self-Driving, the same number that Tesla reported a year earlier in its Q4 2022 update. In the nine months between March 2023 and December 2023, the cumulative miles driven by Full Self-Driving increased by 625 million miles, an annualized rate of approximately 833 million miles per year. That works out to approximately 2,082 miles of FSD driving per customer per year. Data from the Federal Highway Administration shows that the average American drives 14,263 miles per year. Therefore, FSD is only engaged for approximately 14.6% of the mileage covered by customers who paid up to $15,000 for the software.
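
For anyone who wants to check the arithmetic, here is a minimal back-of-envelope sketch using only the figures quoted above; any small difference from the 2,082 figure is rounding.

```python
# Back-of-envelope check of the 14.6% figure, using only the numbers quoted above.
fsd_miles_added = 625_000_000        # cumulative FSD miles added, March 2023 to December 2023
months = 9
fsd_customers = 400_000              # FSD customers, per Tesla's Q4 2023 earnings call
avg_annual_miles = 14_263            # average annual miles per US driver (FHWA)

fsd_miles_per_year = fsd_miles_added / months * 12           # ~833 million miles per year
fsd_miles_per_customer = fsd_miles_per_year / fsd_customers  # ~2,083 miles per customer per year
share = fsd_miles_per_customer / avg_annual_miles

print(f"Annualized FSD miles: {fsd_miles_per_year / 1e6:.0f} million")
print(f"FSD miles per customer per year: {fsd_miles_per_customer:,.0f}")
print(f"Share of each customer's driving done on FSD: {share:.1%}")
```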

Elon Musk says, “the data is unequivocal that supervised Full Self-Driving is somewhere around four times safer, maybe more, than just a human driving by themselves.” It is not plausible that hundreds of thousands of people who paid more than $10,000 for a product that makes them and their families safer would use it only 15% of the time. Yet they do use FSD only 15% of the time, so “supervised FSD” does not make them and their families safer than “driving by themselves”.

Most people want nice, straightforward lives, and they want their cars to be straightforward too. As you can see from many of the replies to your post, many FSD buyers try it when they get it and then, as it says in the manual, FSD does “the wrong thing at the worst time” or “suddenly swerve[s] even when driving conditions appear normal and straight-forward”. This scares them, so they turn it off.

They might try it again when they hear that a new update is so much better than the previous release and fixes so many problems, but FSD invariably does something dangerous again and they switch it off again. I’ve seen many Beta testers mention on social media that their partners/spouses/families won’t let them turn it on while they’re in the car because it did something dangerous and they’re now afraid of it. The 15% of the time FSD is engaged is made up of a large number of occasional users and relatively few dedicated Beta testers like you, who use FSD almost 100% of the time in order to gather data to train the AI.

Beta testers’ YouTube videos show that it frequently turns into oncoming traffic, fails to yield, blows through stop signs, brakes randomly on the freeway, etc. This is not anecdotal; it has happened to me many times. Soon you realize that the only way to be safe when FSD is on is to constantly look in every direction and check your blind spots, because it “can suddenly swerve even when driving conditions appear normal and straight-forward”. That causes considerably more stress than just driving manually. Even WholeMarsBlog said, “I’m going to spend a week going back to driving manually. I am so much faster, more efficient, and more comfortable that it’s not even funny.”

Most of the 400,000 Full Self-Driving buyers aren’t “Beta testers”. They bought FSD because they believed Elon Musk when he repeatedly said that FSD was safer than a human and would be fully autonomous by the end of the year. They have also heard many advocates saying that FSD makes driving more relaxing. How do we reconcile the fact that FSD is less safe than a human driving themselves with Tesla’s claim that FSD goes 3.2 million miles between airbag-deployment crashes, while Teslas not on FSD or Autopilot have a crash every million miles? People only engage FSD 15% of the time. They want to have fewer accidents, so they usually enable it only on the easiest roads, in clear weather and away from construction, where they know it rarely makes mistakes. So FSD and Autopilot rack up 3.2 million miles between crashes in the easiest driving conditions, just as a human would if they were driving those same miles.

Tesla warns that FSD “may not operate as intended” when there is “rain, snow, direct sun, fog”, or on “roads with faded markings”. Of course you would expect far more accidents under these challenging conditions. Human drivers average only one million miles between crashes, but their mileage includes all the high-risk conditions where FSD stays switched off because it would cause even more crashes. This explains how FSD can go 3.2 million miles between accidents on the really easy miles where it is engaged, while the superior human driver, left to handle the challenging miles, goes only one million miles between accidents.
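
To make this selection effect concrete, here is a small illustrative calculation. The per-condition crash rates and the share of “hard” miles below are assumptions chosen purely for illustration, not published Tesla or DoT figures; the point is only that the same underlying driving ability produces both headline numbers once the easy miles are separated out.

```python
# Illustrative only: how the mix of miles, not driving skill, can produce the gap above.
# The per-condition crash rates and the share of "hard" miles are assumptions chosen
# for illustration; they are not published Tesla or DoT figures.

easy_rate = 1 / 3_200_000   # assumed crashes per mile on easy roads in clear weather
hard_rate = 1 / 200_000     # assumed crashes per mile in rain, snow, fog, construction, etc.
hard_share = 0.15           # assumed share of a typical driver's miles that are "hard"

# A human whose mileage includes the hard miles:
human_rate = (1 - hard_share) * easy_rate + hard_share * hard_rate
print(f"All conditions:  1 crash per {1 / human_rate:,.0f} miles")

# The same driver (or FSD) measured only on the easy miles:
print(f"Easy miles only: 1 crash per {1 / easy_rate:,.0f} miles")
```

Under these assumed numbers, the same driver shows roughly one crash per million miles across all conditions, but more than three million miles between crashes on the easy miles alone.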

Also improving FSD’s miles between crashes is the fact that a large proportion of FSD miles are driven by dedicated Beta testers, who are probably, on average, better drivers than those driving without FSD. They are on the job, trying to teach FSD how to drive, and they are very experienced with its limitations. They often have a camera recording them, so they’re probably sober, well rested and alert at all times. They also know that I would make a lot of noise about it if they crashed! You would therefore expect them to have fewer accidents than average drivers, who are more likely to be distracted, drunk, old or tired.

Furthermore, on its website Tesla claims that Autopilot crashes only once every 5.5 million miles, whereas Department of Transportation statistics show that human drivers crash every 0.5 million miles, and Tesla concludes from this that Autopilot is 11 times safer than a human.

This is a completely false conclusion, because Tesla defines a “crash” very differently than the DoT: unlike the DoT, Tesla excludes all crashes where the airbags did not deploy. The number of accidents is of course much higher than the number of crashes where the airbags deployed. We have found no other recent publicly available statistics on the rate of crashes where airbags deployed, but when I saw that Tesla was comparing its figures for airbag crashes to the DoT’s figures for all crashes, I asked the next 19 people I spoke to whether they had ever been in a crash where the airbags deployed, and they all said no. But they all said that they had been in an accident where the airbags didn’t deploy. Comparing statistics on airbag-deployment crashes to all DoT crashes is dishonest.
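
A small worked example shows how much of the claimed 11x advantage can come from the definition mismatch alone. The 1-in-5 share of crashes that deploy airbags is an assumption made purely for illustration (my informal survey above gives no rate); the other figures are the ones Tesla and the DoT publish.

```python
# Illustrative only: mixing two definitions of "crash" inflates the safety ratio.
# The assumed 1-in-5 share of crashes that deploy airbags is purely illustrative.

tesla_miles_per_airbag_crash = 5_500_000   # Tesla's Autopilot figure (airbag deployments only)
dot_miles_per_crash = 500_000              # DoT figure (all crashes, airbags or not)
airbag_share = 1 / 5                       # assumed fraction of crashes that deploy airbags

# Tesla's comparison mixes the two definitions:
mixed_definition_ratio = tesla_miles_per_airbag_crash / dot_miles_per_crash

# An apples-to-apples comparison would count airbag-deployment crashes on both sides:
dot_miles_per_airbag_crash = dot_miles_per_crash / airbag_share
same_definition_ratio = tesla_miles_per_airbag_crash / dot_miles_per_airbag_crash

print(f"Mixed definitions: {mixed_definition_ratio:.0f}x 'safer'")
print(f"Same definition:   {same_definition_ratio:.1f}x (under the assumed 1-in-5 share)")
```

Under that assumed share, an apples-to-apples comparison would show roughly a 2x difference rather than 11x, and the true share of crashes that deploy airbags could move that figure substantially in either direction.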

The reason people are only using supervised FSD 15% of the time is that it is less safe than a human driving themselves.

Dan’s comments were first published in response to Matthew Santoro’s analysis of Tesla Full Self-Driving’s take rate, and can be read here.