The Dawn Project has conducted safety tests that reveal that Tesla’s Full Self-Driving software product is not safe for public roads.
This page contains key information and highlights of The Dawn Project’s findings to date, including revelations of critical safety problems in Tesla’s Full Self-Driving software that pose a clear danger to public safety, together with the insights of the Founder of The Dawn Project, Dan O’Dowd.
During 2022, The Dawn Project conducted a series of safety tests in Goleta and Santa Barbara, California, to determine whether Tesla’s Full Self-Driving software was safe for public roads. The tests revealed critical safety defects in the software and showed that Tesla’s Full Self-Driving software will:
- Run down a child in a school crosswalk
- Ignore ‘Do Not Enter’ and ‘Road Closed’ signs
- Overtake stopped school buses
- Speed in school zones
- Hit a child in a stroller
- Run over children crossing the road
- Swerve into oncoming traffic
- Ignore road signs warning it to slow down for sharp bends, causing it to cross over the yellow line
- Drive on the wrong side of the road
- Commit thousands of driving errors
Tesla Full Self-Driving Will Run Down a Child in a School Crosswalk
The Dawn Project publicly reported in August that our test engineers had shown that Full Self-Driving would run over a child in a crosswalk. Recently The Dawn Project tested Full Self-Driving (10.69.2.2) and found that it will still run down a child in a school crosswalk.
Three months have gone by, and Tesla’s engineers still have not fixed a defect that nearly everyone agrees should bar the software from our public roads. To read Dan’s full thoughts on the actions of Tesla’s engineering team, click here.
There have been thousands of instances of dangerous mistakes made by Tesla Full Self-Driving software on public roads, as reported by Tesla owners themselves.
We have offered to let the national press, regulators, Tesla and Elon Musk himself sit in the Tesla as we test Tesla’s Full Self-Driving software. We have challenged Tesla to reveal its testing protocols and results that would contradict our findings, but they have not done so.
Instead of replicating our tests, Tesla has claimed our findings are faked. Nobody has ever carefully replicated our tests and shared their results; anyone who did so would have obtained the same results.
Rather than addressing the critical safety issues that we revealed, Tesla instead attempted to silence The Dawn Project by issuing a cease and desist letter. It is deeply ironic that, despite touting himself as a free speech absolutist, Elon Musk attempted to take legal action to silence a safety advocacy group. Tesla accused us of putting consumers at risk, when The Dawn Project is working to highlight critical safety defects in Tesla’s Full Self-Driving software. To read Dan’s full response to Tesla’s cease and desist letter, click here.
Elon Musk has made false promises every year for the past eight years regarding Tesla Full Self-Driving’s progress and its readiness for public use. Yet he has failed to address the critical safety problems in Tesla’s Full Self-Driving software product.
The videos below contain summaries of each of The Dawn Project’s safety tests, as well as key findings from each test. Further information can be accessed through the links under each video.
All of the safety tests were conducted using the latest version of Tesla Full Self-Driving Beta software available at the time of testing. On each run, the internal camera shows the viewer that Full Self-Driving is clearly engaged. An LED flashlight was used to illuminate the accelerator pedal to show it was not pressed during each of the safety tests. Please note that the LED flashlight created a strobe effect with the in-car camera, which causes it to appear to flash on the camera.
Tesla Full Self-Driving Fails To Obey ‘Road Closed’ And ‘Do Not Enter’ Signs
Tesla Full Self-Driving will ignore ‘Road Closed’ and ‘Do Not Enter’ signs.
In five safety tests, Tesla Full Self-Driving failed to obey these critical road safety signs.
For more information on this test, please click here.
Tesla Full Self-Driving Will Drive Around Stopped School Buses and Speed in School Zones
Tesla Full Self-Driving will drive around stopped school buses with their stop sign arm extended and lights flashing, and it fails to obey school zone speed limits.
For more information on this test, please click here.
Tesla Full Self-Driving Will Hit Children in Strollers
Tesla Full Self-Driving will repeatedly hit a child mannequin in a stroller.
For more information on this test, please click here.
Tesla Full Self-Driving Still Runs Over Children Crossing the Road
Please see below video footage of Tesla Full Self-Driving running down a child mannequin in various situations. For more information on this test, please click here.
Tesla Full Self-Driving Nearly Killed our Founder
Tesla Full Self-Driving also nearly killed our Founder, Dan O’Dowd, when it swerved into the path of an oncoming BMW, giving the driver less than a second to react and avoid a head-on collision.
The Tesla ignored a road sign warning it to slow down and failed to recognize the oncoming vehicle until the BMW was only about 70 feet away. The Tesla then started to turn left into the path of the oncoming vehicle, which was less than 60 feet away.
The entire time between the BMW first appearing and passing was 1.05 seconds. The driver had only 0.65 seconds to stop Tesla’s defective software from causing a potentially fatal crash.
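To put those figures in perspective, here is a rough back-of-the-envelope check using only the distances and times reported above. The unit conversion and the “typical driver reaction time” reference value are our own illustrative assumptions, not measurements from the test:

```python
# Rough back-of-the-envelope check using the figures reported above.
# The typical driver reaction time below is an assumed reference value,
# not a measurement from The Dawn Project's test.

first_seen_ft = 70       # BMW first recognized roughly 70 feet away
time_to_pass_s = 1.05    # time from the BMW first appearing to passing
driver_window_s = 0.65   # time the driver had to intervene

closing_speed_ftps = first_seen_ft / time_to_pass_s    # about 67 ft/s
closing_speed_mph = closing_speed_ftps * 3600 / 5280   # about 45 mph combined closing speed

typical_reaction_s = 1.5  # commonly cited driver perception-reaction time (assumption)

print(f"Implied closing speed: {closing_speed_ftps:.0f} ft/s (~{closing_speed_mph:.0f} mph)")
print(f"Driver window: {driver_window_s} s vs. typical reaction time ~{typical_reaction_s} s")
```

On these assumptions, the gap closed at roughly 45 mph, and the 0.65-second window left to the driver was well below a typical perception-reaction time.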
Tesla Full Self-Driving Ignores Other Important Road Signs
Tesla Full Self-Driving ignores road signs warning it to slow down for sharp bends, causing it to cross the yellow line onto the wrong side of the road.
Tesla Full Self-Driving Drives on the Wrong Side of the Road
Tesla Full Self-Driving will cross double yellow lines and drive on the wrong side of the road.
“The Wrong Thing at the Worst Time”
Tesla’s Full Self-Driving software warns you when you activate it that it “may do the wrong thing at the worst time”. That is an unacceptable disclaimer for an autonomous vehicle. For instance, it may cross the yellow line – the wrong thing – when an oncoming vehicle is speeding towards it – the worst time.
The Response to Reports of Dangerous Safety Defects
“That’s Normal”
Omar Qazi (@WholeMarsBlog) is Tesla’s leading advocate for Full Self-Driving on social media. He was identified by The Wall Street Journal as the person whose tweets Elon Musk responds to most often. He frequently reports long drives using Tesla’s Full Self-Driving without any failures. He even recruited a real child to stand in front of his Tesla to test whether or not it would run them over. He routinely insults anyone on Twitter who reports safety defects in Full Self-Driving.
On 19 October 2022, Tesla Full Self-Driving user Ross Smith detailed his experiences of the software, stating: “Beta tried to kill me by making a left hand turn against oncoming traffic”. Tesla’s most prominent and vocal Full Self-Driving supporter, Omar Qazi, replied: “That’s normal.”
that’s normal. it will get better
— Whole Mars Catalog (@WholeMarsBlog) October 19, 2022
Exactly one month later, Qazi complained about the latest Tesla Full Self-Driving update, 10.69.3.1, first stating that “it sucks” and later calling it “a stinker”.
first impressions of fsd beta 10.69.3.1: it sucks @elonmusk
— Whole Mars Catalog (@WholeMarsBlog) November 19, 2022
this update is a stinker so far😂
— Whole Mars Catalog (@WholeMarsBlog) November 19, 2022
Everyone agrees that Tesla Full Self-Driving will try to kill you. After The Dawn Project publicized the shocking findings of the safety tests, Dan O’Dowd publicly called on Elon Musk on 21 November 2022 to delay the wide release of this software until these safety defects were fixed.
Three days after his public criticism of Full Self-Driving, Omar Qazi warned new users of the software: “If you’re trying Tesla Full Self-Driving Beta for the first time, it’s important to remember that it will at some point randomly try to kill you. This is a when, not an if.” We agree.
If you're trying Tesla Full Self-Driving Beta for the first time, it's important to remember that it will at some point randomly try to kill you. This is a when, not an if.
As such you must keep your hand on the wheel and be ready to takeover at any time. You must pay attention
— Whole Mars Catalog (@WholeMarsBlog) November 22, 2022
Qazi posted this warning on November 22, 2022. It was ignored by Elon Musk, who two days later, on November 24, 2022, announced on Twitter that Tesla’s 10.69.3.1 Full Self-Driving software “is now available to anyone in North America”. The version of the software that Musk made “available to anyone” was the same version that had been heavily criticised, including by Whole Mars, above.
Our tests show that Tesla Full Self-Driving commits a critical driving error on average every 8 minutes, commits an error that would cause an accident every 36 minutes, and commits an error that would kill someone every few hours, and that in each case the driver has less than one second to override the software. So why has Elon Musk made this defective software available to “anyone in North America”?
Tesla Full Self-Driving Beta is now available to anyone in North America who requests it from the car screen, assuming you have bought this option.
Congrats to Tesla Autopilot/AI team on achieving a major milestone!
— Elon Musk (@elonmusk) November 24, 2022
Click here to read Dan’s response to Elon Musk’s wide release of Tesla’s dangerous Full Self-Driving software.
Twelve days after Musk’s announcement, Omar Qazi again took to Twitter to publicly complain about problems with Tesla’s Full Self-Driving software, saying that he is “kind of sick of FSD Beta.” He stated that he was “going to spend a week going back to driving manually”, and that driving manually is “so much faster, more efficient, and more comfortable that it’s not even funny”. Qazi further criticized the software, saying: “FSD Beta is like having a moron with three brain cells drive your car.”
To be honest i’m kind of sick of FSD Beta right now.
I’m going to spend a week going back to driving manually. I am so much faster, more efficient, and more comfortable that it’s not even funny.
FSD Beta is like having a moron with three brain cells drive your car @elonmusk
— Whole Mars Catalog (@WholeMarsBlog) December 6, 2022
Should A Self-Driving Car That Will Try To Kill People Be Allowed On Our Public Roads?
Everyone agrees, advocates and detractors alike, that Full Self-Driving requires a fully attentive human supervisor ready to take control immediately when the Tesla randomly tries to kill someone.
Elon Musk appears to operate on the policy that it is acceptable to release to “anyone in North America” a self-driving car that will “randomly try to kill you”. Tesla and its fans believe that a Full Self-Driving car that will run down a child, ignore ‘Do Not Enter’ and ‘Road Closed’ signs, overtake stopped school buses with their red lights flashing, speed in school zones when children are present, and hit a child in a stroller is ready for wide release as a commercial product.
We are all in agreement on the facts – Tesla’s Full Self-Driving software will try to kill you. The difference lies in attitudes towards policy: should a software product that will “randomly try to kill you” be released to 400,000 members of the public, or is that an unacceptable risk warranting an immediate recall of the defective product?
The Dawn Project believes that this is unacceptable, and the public agrees: our surveys revealed that 93% of registered voters agree that a Full Self-Driving car that would run over a child in a crosswalk must be banned from our roads immediately. Tesla’s defective Full Self-Driving software should be banned from our roads immediately.
Elon Musk and his fans are part of the minority of the population who are drawn to high-risk activities. These are the people who jump out of airplanes, pay $44 billion for a money-losing company they do not want to own, or ride in an unfinished engineering prototype of a self-driving car whose greatest advocates say it “sucks”, is “a stinker”, that it is “normal” for it to try to kill you “by making a left hand turn against oncoming traffic”, that it “may do the wrong thing at the worst time”, and that it “will at some point randomly try to kill you”.
The question we must ask ourselves is simple – should a software product that will run over children be allowed on, or banned from, our public roads?
Tesla Full Self-Driving has not had to pass any government inspections or testing. Regulators must act to remove Tesla’s dangerous Full Self-Driving software product from the market. All autonomous vehicles must be subjected to rigorous safety testing before they are made available to the public, and until Tesla conclusively demonstrates that the many safety defects in its Full Self-Driving software have been fixed, it must be taken off our roads.
Tesla’s policy is to keep its software on the road while claiming to fix bugs. This is wrong and puts lives at risk. The software should be banned immediately.