Following our first New York Times ad in January 2022, The Dawn Project's founder, software safety expert Dan O'Dowd, ran for the US Senate in California in April of that year on the single issue of raising awareness of the dangers of Tesla's Full Self-Driving software, launching a series of nationwide campaign ads to draw attention to this urgent issue.

In Summer 2022, The Dawn Project conducted a series of safety tests of Tesla’s Full Self-Driving software. We published the results as part of a nationwide television advertising campaign, calling on Congress to shut Tesla Full Self-Driving down after our tests found that it will run down a child-sized mannequin in the road.

Tesla did not fix the defect. Instead of addressing the critical defects our safety tests had identified, Tesla threatened The Dawn Project with litigation, claiming that our tests were staged. You can read Dan O'Dowd's response to Tesla's Cease and Desist letter here.

Tesla knows about many of these defects because we and other third-party testers have told them. In fact, drivers have reported thousands of safety-critical defects in Tesla's Full Self-Driving software, which Tesla has failed to fix.

Following our series of safety tests, we decided to publish a series of full-page ads in The New York Times identifying specific defects in Tesla's Full Self-Driving software. We addressed each of these full-page ads to Tesla, calling upon the automaker to fix these problems. The ads were published alongside safety tests conducted by The Dawn Project demonstrating these defects.

We publicly warned Tesla that we had identified these defects, yet with each software update to Tesla Full Self-Driving we observed that the issues remained unresolved.

So, in February 2023, we ran a Super Bowl ad, drawing on the tests from our New York Times ad series to bring these defects to an even wider audience.

Despite our repeated warnings of these critical safety defects, in a major newspaper and during the largest advertising event in the world, Tesla did nothing.

Tragically, one month after our Super Bowl ad was broadcast, a school child in North Carolina was run down by a Tesla that blew past a stopped school bus, leaving him on a ventilator with a fractured neck and a broken leg. This was exactly the type of tragic scenario we had warned about. To this day, Tesla's self-driving software fails to stop for a school bus when pedestrians are exiting, as we demonstrated to politicians in Summer 2023.

We also tested Tesla's driver monitoring system, which, as part of a Level 2 ADAS, must ensure that the driver remains alert at all times. After all, Tesla warns users that the software may "do the wrong thing at the worst time", which means the system designed to confirm the driver is paying attention must itself work reliably, keeping the driver attentive and fully alert.

Tesla's in-cabin driver monitoring system: a camera above the rear-view mirror is meant to monitor the driver's face and eyes to check that they are paying attention to the road, and sensors in the steering wheel are supposed to check that the driver is applying force to the wheel.

What we found was that Tesla's driver monitoring system is woefully inadequate and fails to ensure that the driver is actually supervising the software. We told the press, who also found serious failures in Tesla's system, after The Dawn Project's tests showed that a teddy bear, a balloon and a unicorn toy could operate Tesla's self-driving software, despite regulators demanding that an attentive driver be ready to intervene at any moment.

We have told Tesla. We have told regulators. We have told politicians. 

Now we are telling you.

We are telling you, in no uncertain terms, that Tesla has failed to fix these serious safety defects. Tesla's blatant disregard for public safety in keeping its self-driving software on public roads means that a boycott of Tesla is now necessary.

The US public deserves to be safe, and we demand that Tesla protect road users by removing its self-driving software from public roads immediately, before more people are killed.

The official statistics are alarming. Since 2021, the Department of Transportation has recorded over 1,000 crashes, 27 of them fatal, claiming 33 lives. Only one other fatality has been recorded across all other manufacturers, and Tesla counts more crashes than all other manufacturers combined.