The Dawn Project has analyzed many hours of YouTube videos recorded by drivers documenting their experiences with Tesla's so-called Full Self-Driving software.

About every 8 minutes, Full Self-Driving commits a Critical Driving Error, which according to the California DMV’s Driving Performance Evaluation is a major error such as:

  • Making contact with an object when it could have been avoided
  • Disobeying traffic signs or signals
  • Disobeying safety personnel or safety vehicles
  • Making a dangerous maneuver that forces others to take evasive action

On average, about every 36 minutes of city driving, safety defects in the Full Self-Driving software cause a malfunction in the steering, braking, or throttle that, if not corrected by the driver, would likely cause an accident.

I cannot think of any other commercial product that has a critical safety malfunction (or any serious malfunction) every 8 minutes. Who would buy anything that malfunctions every 8 minutes? A hair dryer? I think not. Nor a stove, a refrigerator, a light switch, a TV, a computer running Windows 3.0 in 1990, a spreadsheet, an Internet browser, a DVR, headphones, an airplane, a heater, an air conditioner, or a fork.

Full Self-Driving drives like a suicidal drunken teenager. It will likely wreck your Tesla in 36 minutes of city driving if you don’t react quickly enough to stop it from doing insane things. The YouTube videos show Full Self-Driving trying to ram into giant concrete columns, trying to go the wrong way down a one-way street, trying to ram into cars parked on the road, driving in bus lanes, crazily shifting back and forth between lanes, and charging towards pedestrians in a crosswalk.

Almost all the YouTube videos were made in the daytime under good weather conditions. Under ideal conditions, Full Self-Driving is a horrible driver. How do you think it performs at night in heavy rain on an unlit, winding two-lane road? The oncoming cars' headlights reflect off the sheets of water on the road, making it impossible to see any road markings. Water on the camera lens refracts the oncoming headlights into a crazy quilt of light that makes it impossible to see the right edge of the road. All you can see is a cacophony of light from the distorted headlights coming straight at you at a closing speed of 80 miles per hour. What do you do, Full Self-Driving?

To simulate the Full Self-Driving experience, put a 9-year-old child on your lap and let them steer your car while you stand by to take over every time they put you, them, the other passengers in your car, or the public in danger. Why don't we let 9-year-olds learn to drive this way? Don't they need lots of real-world experience to properly train their neural networks? The driver's ability to correct each of the 9-year-old's mistakes before they kill someone would be impaired if the driver were tired, angry, or drunk. It requires a similarly impaired mental state to see the upside of allowing Full Self-Driving, which has the driving skills of a 9-year-old, to learn to drive cars on our roads just because someone in an unknown state of mental impairment is there to override its proclivity to kill people. If a cop saw you let a 9-year-old steer your car, you would probably be arrested. If anyone were killed, you would end up in jail. Full Self-Driving should not be on the road in its current condition, and regulators need to act responsibly and step in.

Watch the videos on YouTube. Notice from the time stamps how often Full Self-Driving puts the occupants and the general public in imminent danger. Count how many traffic offenses it commits that would have gotten the driver a ticket if a cop had been present. Watch how often it makes mistakes that would have caused it to flunk a driver’s test to get a driver’s license.

These videos were not made by critics of Tesla, or short-sellers, or agents of the oil companies. Nearly all these videos were made by obvious Full Self-Driving fans who praise it highly every time it successfully completes a maneuver that every human driver is expected to execute with ease. And they excuse its every clumsy failure. They say the same sort of things you would say to a toddler you are toilet-training. The YouTube fans of Full Self-Driving have very low expectations for its maturity.


How Close is Tesla to Full Self-Driving?

The YouTube videos indicate that without continuous human correction, Full Self-Driving, in its current form, under ideal weather conditions, would cause an accident about every 36 minutes of city driving. Bureau of Transportation Statistics figures for 2019 say that in the U.S. there were 6,756,000 crashes in 3,261,772 million vehicle-miles driven. This means that there is an average of about 483,000 miles driven between crashes. At an average speed of 30 MPH, there is an average of about 16,000 hours between accidents caused by human drivers. This is more than 24,000 times longer than the time between accidents that would be caused by autonomous Full Self-Driving today.
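The crash-interval arithmetic can be checked with a short script. The inputs are the figures quoted in the text (2019 BTS crash and mileage totals, a 30 MPH average speed, and a 36-minute interval between Full Self-Driving accidents); the computed ratio lands slightly above the conservative 24,000 figure:

```python
# Figures quoted from the Bureau of Transportation Statistics, 2019.
crashes = 6_756_000
vehicle_miles = 3_261_772_000_000  # 3,261,772 million vehicle-miles

miles_between_crashes = vehicle_miles / crashes      # ~483,000 miles
hours_between_crashes = miles_between_crashes / 30   # at 30 MPH: ~16,000 hours

# Time between likely FSD-caused accidents, from the video sample.
fsd_hours_between_accidents = 36 / 60

ratio = hours_between_crashes / fsd_hours_between_accidents

print(f"{miles_between_crashes:,.0f} miles between human-driver crashes")
print(f"{hours_between_crashes:,.0f} hours between human-driver crashes")
print(f"humans go roughly {ratio:,.0f}x longer between accidents")
```

The exact ratio works out to roughly 26,800, so "more than 24,000 times" is a conservative statement of the gap.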

After 8 years of development, Full Self-Driving is not 95% done. It is not 90% done, or even 50% done. It is not sufficient to make it twice as reliable, or 10 times as reliable, or 100 times, or 1,000 times, or even 10,000 times more reliable.

Before Full Self-Driving can be allowed to drive autonomously on our roads, Tesla must improve its reliability to at least 24,000 times its current reliability.

How long is it going to take to make Full Self-Driving 24,000 times more reliable? We can measure the progress in improving reliability by comparing YouTube videos from about 9 months ago with the most recent videos. We compare the time between Critical Driving Errors with the latest software (about every 8 minutes) versus the time in the older videos (about every 4 minutes). The time between likely accidents has similarly improved, from about 13 minutes to about 29 minutes. There are not as many older videos, so that data is less reliable. But using the data that we have, it seems that Full Self-Driving has roughly doubled the time between Critical Driving Errors and likely accidents in the last 9 months. That sounds pretty good. But even if progress continues at this exponential rate, it will take over 10 years for Full Self-Driving reliability to reach the level of the average human driver (including all the teenagers and alcoholics). If, finally, in 2032, after waiting for over 10 years, Full Self-Driving were enabled on every car in the world, it would only kill a million or so people each year (like humans do). How many lawsuits would that generate?
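The "over 10 years" projection follows from the doubling rate. Assuming, as the text does, that reliability roughly doubles every 9 months and that a roughly 24,000-fold improvement is needed, the number of doublings required is log2(24,000):

```python
import math

# Assumptions from the text: reliability roughly doubles every 9 months,
# and a ~24,000x improvement is needed to match average human drivers.
improvement_needed = 24_000
doubling_period_years = 9 / 12

doublings = math.log2(improvement_needed)   # ~14.6 doublings needed
years = doublings * doubling_period_years   # ~10.9 years at the current rate

print(f"{doublings:.1f} doublings needed, ~{years:.1f} years at the current rate")
```

Under these assumptions the projection comes out to just under 11 years, consistent with the "over 10 years" claim.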

Defenders of Full Self-Driving respond indignantly by yelling "IT IS BETA, YOU IDIOT!" It doesn't matter what you call it. The reality is that ordinary citizens are supervising thousands (one report says 12,000) of 120 mph, two-ton killing machines being driven by possibly the least reliable software you will ever encounter.


Unsafe At Any Speed

The book Unsafe at Any Speed, written by Ralph Nader, became a bestseller in 1966 for reporting that the Chevrolet Corvair had safety defects in its control systems in extreme situations. The safety defects in Full Self-Driving are not limited to extreme situations; they affect all situations and are far more severe than any problems in the Corvair. We must not tolerate having a product on our roads that, in normal operation, makes a Critical Driving Error that puts people's lives in danger about every 8 minutes in city driving. What is the point of a product that makes a car far more dangerous than it would be otherwise?

The Full Self-Driving (Beta) 2021.4.10.12 Release in the hands of the public carries the following warning from Tesla: "It may do the wrong thing at the worst time." How can anyone tolerate a safety-critical product on the market which may do the wrong thing at the worst time? Isn't that the definition of defective? Full Self-Driving must be removed from our roads immediately.

The Dawn Project is offering $10,000 to the first person who can name another commercial product from a Fortune 500 company that has a critical malfunction every 8 minutes. To apply, and for full Terms and Conditions, please click here.


To download the verification documents for our claims about Tesla Full Self-Driving, please click the links below: