On Sunday 11 February 2024, The Dawn Project broadcast a public safety message during Super Bowl LVIII demonstrating the dangers of Tesla’s self-driving software. The ad highlights how Tesla endangers schoolchildren by failing to address a critical safety defect whereby its self-driving software will overtake stopped school buses.

The Dawn Project’s Super Bowl ad features footage relating to an accident in Halifax County, North Carolina, in which 17-year-old Tillman Mitchell was struck by a Tesla reportedly operating in Autopilot mode as he was exiting a school bus. The ad can be viewed below:

The Dawn Project’s public safety announcement follows The Washington Post’s reporting on the March 15, 2023 crash which hospitalized Mitchell. The crash came just a month after The Dawn Project broadcast a public safety announcement during Super Bowl LVII, warning that Tesla’s self-driving software would overtake stopped school buses and run down a child crossing the road. Four months earlier, the safety advocacy group warned of the issue in a full-page New York Times ad.

The Dawn Project demands that Tesla take urgent action to fix this critical safety defect, and urges the National Highway Traffic Safety Administration (NHTSA) to immediately recall Tesla’s self-driving software until it obeys traffic laws.

The Dawn Project is a safety advocacy group founded in 2021 by software security expert Dan O’Dowd, which campaigns to make computers safe for humanity. The first danger the group is tackling is the reckless deployment of Tesla’s self-driving software on public roads, which regulators have revealed has been involved in over 1,000 crashes.

Further information about The Dawn Project’s most recent campaign ad to “call foul” on Tesla’s self-driving software can be found below:

What is Tesla’s self-driving software?

Tesla currently sells two self-driving software products for its cars: Autopilot and Full Self-Driving (Beta). Both Tesla Autopilot and Full Self-Driving are regulated as Level 2 Advanced Driver Assistance Systems (ADAS), which, according to the Society of Automotive Engineers, the global professional association and standards organization, means that the driver remains responsible for the vehicle’s actions, even when its driver assistance features are operational and engaged.

However, Full Self-Driving will steer, accelerate, brake and navigate to a destination set by the driver. When the software is activated, the driver is not driving the car; rather, Tesla states that the driver must supervise Full Self-Driving by remaining alert and being ready to take over. Tesla also warns in a lengthy disclaimer that the software “may do the wrong thing at the worst time.” At present, an estimated 400,000 Tesla vehicles are equipped with Full Self-Driving.

All Tesla vehicles manufactured since April 2019 contain the built-in hardware necessary to support Full Self-Driving, and older models can be upgraded. Once a Tesla owner purchases Full Self-Driving, which costs $12,000 upfront or $200 per month, the car will download the software and Full Self-Driving will become available.

What does the law say about passing stopped school buses?

It is illegal in every state in the US to pass a stopped school bus with its red lights flashing and stop arm extended. 

Self-driving systems must obey all traffic laws to ensure the safety of public roads. The Dawn Project finds it unacceptable for Tesla to deploy over 400,000 self-driving vehicles that endanger children on public roads.

What has The Dawn Project previously warned about regarding Tesla Full Self-Driving and stopped school buses?

The Dawn Project has repeatedly warned about the dangerous interaction between Tesla Full Self-Driving and stopped school buses. Click here to read about our New York Times ad, published in November 2022, which warned Tesla that its self-driving technology would blow past stopped school buses.

Why the Super Bowl?

We believe the public safety epidemic engendered by Elon Musk and Tesla’s reckless deployment of millions of self-driving Teslas should concern us all.

That is why we are broadcasting our message during the Super Bowl, to inform and educate as many Americans as possible about this vital public safety issue.

Too many Americans do not know the true extent of Tesla’s self-driving death toll: regulators have shown that Tesla’s self-driving software has been involved in over 1,000 crashes, 27 of which have been fatal.

At The Dawn Project, our mission is to inform the American public of this concerning issue which impacts all road users. We demand that the software running self-driving cars be the best software ever made, rather than the worst. 

In 2023, we informed millions of viewers of Super Bowl LVII of the dangers of Tesla Full Self-Driving. The ad, which can be viewed here, publicized the findings of safety tests conducted by The Dawn Project, which uncovered critical safety defects in Tesla’s Full Self-Driving, including that it will blow past stopped school buses and run down a child crossing the road.

The Dawn Project’s ad was broadcast during the game to millions of viewers nationwide, including politicians and regulators in Washington, D.C., and in the state capitals of California, New York, Texas, Florida and Georgia. The ad urged NHTSA and the Department of Motor Vehicles (DMV) to immediately turn off Tesla’s Full Self-Driving software until all of the critical safety defects we, and others, have identified are fixed.

The scale of the problem and The Dawn Project’s campaign

That Tesla allows Autopilot to be engaged on roads where it is not intended to operate demonstrates Tesla’s blatant disregard for public safety. 

People have died because of this disregard for road users and the American public.

The Dawn Project is campaigning to ensure that safety-critical software, such as that which powers self-driving vehicles, is held to the very highest standards of safety, security and reliability. 

By allowing Autopilot to operate on roads that are not controlled-access highways, Tesla is failing to protect road users by not ensuring that its software operates within its Operational Design Domain.

The purpose of our Super Bowl ad this year is to raise awareness that Tesla’s Autopilot has been involved in fatal collisions where it was not intended to be used. 

This must stop.

We demand that Tesla be held to account over these deaths, and that it ensure Autopilot is restricted to controlled-access highways. NHTSA must intervene to keep road users safe, and ban Tesla’s self-driving software until Tesla has fixed the defects we have identified.

The Dawn Project has been extensively testing Tesla Full Self-Driving since June 2022. The group was founded in 2021 by Dan O’Dowd, a software security expert who develops safety-critical systems for NASA, Boeing and the US military.