On Sunday, February 11th, 2024, The Dawn Project broadcast a public safety message during Super Bowl LVIII demonstrating the dangers of Tesla’s self-driving software. The ad highlighted how Tesla has endangered road users by allowing its self-driving software to activate on roads where Tesla knows it cannot operate safely, resulting in multiple fatalities. 

The Dawn Project’s Super Bowl commercial features footage from several fatal crashes involving Tesla’s Autopilot software on roads where it was not intended to operate, and can be found below:

The Dawn Project’s public safety message follows an investigation by The Washington Post revealing the shocking death toll of Tesla Autopilot when used outside the software’s Operational Design Domain: The Post identified “at least eight fatal or serious wrecks involving Tesla Autopilot on roads where the driver assistance software could not reliably operate”.

We demand that Tesla be held to account over these deaths and that Autopilot be restricted to controlled-access highways. The National Highway Traffic Safety Administration (NHTSA) must intervene to keep road users safe and ban Tesla’s self-driving software until Tesla has fixed the defects we have identified. 

The Dawn Project is a safety advocacy group founded in 2021 by software security expert Dan O’Dowd, campaigning to make computers safe for humanity. The first danger the group is tackling is the reckless deployment of Tesla’s self-driving software on public roads; regulators have revealed that the software has been involved in over 1,000 crashes.

Further information about The Dawn Project’s most recent campaign ad to “call foul” on Tesla’s self-driving software can be found below:

What is Tesla’s self-driving software?

Tesla currently sells two self-driving software products for its cars: Autopilot and Full Self-Driving (Beta). Both are regulated as Level 2 Advanced Driver Assistance Systems (ADAS), which, according to the Society of Automotive Engineers, the global professional association and standards organization, means that the driver remains responsible for the vehicle’s actions even when its driver assistance features are engaged.

All Tesla vehicles manufactured since April 2019 contain the hardware necessary to support Full Self-Driving, and older models can be upgraded. Once a Tesla owner purchases Full Self-Driving, which costs $12,000 or $200 per month, the car downloads the software and Full Self-Driving becomes available.

Full Self-Driving will steer, accelerate, brake and navigate to a destination set by the driver. When the software is activated, the driver is not driving the car; rather, Tesla states that the driver must supervise Full Self-Driving by remaining alert and ready to take over. Tesla also warns in a lengthy disclaimer that the software “may do the wrong thing at the worst time.” At present, an estimated 400,000 Tesla vehicles are equipped with Full Self-Driving.

What is Autopilot’s ‘Operational Design Domain’, and why does it matter?

Whereas Full Self-Driving can be activated on city streets, Tesla Autopilot is “intended for use on controlled-access highways”, i.e. freeways, “with a fully attentive driver”, according to Tesla’s owner’s manual.

However, recent tests conducted by The Dawn Project have revealed that Autopilot can be engaged on roads where it is not designed to operate. 

During testing on suburban roads in Santa Barbara County, California, The Dawn Project also found the following safety defects in Autosteer:

Autopilot can also be engaged on many roads with lane markings, and is not restricted to the “controlled-access highways” for which Tesla states it is designed.

Crashes and deaths have occurred outside of Autopilot’s Operational Design Domain

This means that Tesla’s self-driving software can activate outside of its Operational Design Domain, the term describing the conditions under which a piece of software can be considered safe to use. 
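To illustrate what restricting a driver assistance feature to its Operational Design Domain could mean in practice, here is a minimal, purely hypothetical sketch of an engagement check that refuses to activate outside a defined set of road types. The road classes, names and logic below are illustrative assumptions only and do not represent Tesla’s actual software.

```python
# Purely hypothetical sketch of an Operational Design Domain (ODD) gate.
# Road classes, names and logic are illustrative assumptions, not Tesla's code.
from enum import Enum, auto


class RoadClass(Enum):
    CONTROLLED_ACCESS_HIGHWAY = auto()   # freeway: no cross traffic, no stop signs
    ARTERIAL_WITH_CROSS_TRAFFIC = auto()
    SUBURBAN_STREET = auto()


# The ODD for this hypothetical driver assistance feature:
# engagement is permitted only on controlled-access highways.
ODD = {RoadClass.CONTROLLED_ACCESS_HIGHWAY}


def may_engage(road_class: RoadClass, driver_attentive: bool) -> bool:
    """Allow engagement only inside the ODD and with an attentive driver."""
    return road_class in ODD and driver_attentive


if __name__ == "__main__":
    # Inside the ODD: engagement allowed.
    print(may_engage(RoadClass.CONTROLLED_ACCESS_HIGHWAY, True))   # True
    # Outside the ODD (e.g. a suburban street with cross traffic): refused.
    print(may_engage(RoadClass.SUBURBAN_STREET, True))             # False
```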

This defect of activating outside the Operational Design Domain was identified in the lawsuit following a 2019 crash in Key Largo, Florida, involving Dillon Angulo, in which a self-driving Tesla blew through a T intersection, past a stop sign, and crashed into a parked vehicle:

By allowing its dangerous and defective software to operate outside of its Operational Design Domain, Tesla endangers millions of road users on a daily basis, and lives have tragically been lost as a result of this negligence and contempt for public safety. 

The Angulo crash, however, is not an isolated example of Autopilot crashes on roads where it was not safe for the software to operate. 

On May 7th, 2016, Joshua Brown was killed when his Tesla Model S sedan collided with a tractor-trailer that was crossing the Tesla’s path on US Highway 27A, near Williston, Florida:

Following the fatal collision, the National Transportation Safety Board (NTSB) investigated and demanded that Tesla “incorporate safeguards that limit use of automated vehicle control systems to those conditions for which they were designed”.

Three years later, in Delray Beach, Florida, Jeremy Banner died when his Tesla Model 3, also operating in Autopilot mode, collided with a semi-truck at approximately 70 mph:

Banner’s Tesla slid under the truck’s trailer, shearing off the car’s roof. He was tragically killed on impact. 

Once again, the NTSB investigated the collision and found that Tesla Autopilot “allows a driver to activate the partial driving automation systems at locations and under circumstances for which their use is not appropriate, safe, or included in the manufacturer’s design, such as roadways that have cross traffic”. 

Most recently, teenager Tillman Mitchell sustained severe injuries when he was struck by a Tesla, allegedly operating on Autopilot, as he stepped off a school bus in North Carolina in March 2023. 

According to police reports obtained by The Washington Post, the school bus had its stop sign extended and lights flashing when the Tesla Model Y ran down Mitchell on North Carolina Highway 561.

Why the Super Bowl?

We believe the public safety epidemic engendered by Elon Musk and Tesla’s reckless deployment of millions of self-driving Teslas should concern us all.

That is why we are broadcasting our message during the Super Bowl, to inform and educate as many Americans as possible about this vital public safety issue.

Too many Americans do not know the true extent of Tesla’s self-driving death toll: regulators have shown that the software has been involved in over 1,000 crashes, 27 of which have been fatal. 

At The Dawn Project, our mission is to inform the American public of this concerning issue which impacts all road users. We demand that the software running self-driving cars be the best software ever made, rather than the worst. 

In 2023, we informed millions of viewers of Super Bowl LVII of the dangers of Tesla Full Self-Driving. The ad, which can be viewed here, publicized the findings of safety tests conducted by The Dawn Project, which uncovered critical safety defects in Tesla’s Full Self-Driving, including that the software will:

The Dawn Project’s ad was broadcast during the game to millions of viewers nationwide, including politicians and regulators in Washington, D.C., and in the state capitals of California, New York, Texas, Florida and Georgia. The ad urged NHTSA and the Department of Motor Vehicles (DMV) to immediately turn off Tesla’s Full Self-Driving software until all of the critical safety defects we, and others, have identified are fixed. 

The scale of the problem and The Dawn Project’s campaign

That Tesla allows Autopilot to be engaged on roads where it is not intended to operate demonstrates the company’s blatant disregard for public safety. 

People have died because of this disregard for road users and the American public.

The Dawn Project is campaigning to ensure that safety-critical software, such as that which powers self-driving vehicles, is held to the very highest standards of safety, security and reliability. 

By allowing Autopilot to operate on roads that are not controlled-access highways, Tesla fails to ensure that its software operates within its Operational Design Domain, and fails to protect road users. 

The purpose of our Super Bowl ad this year is to raise awareness that Tesla’s Autopilot has been involved in fatal collisions on roads where it was not intended to be used. 

This must stop.

We demand that Tesla be held to account over these deaths and that Autopilot be restricted to controlled-access highways. NHTSA must intervene to keep road users safe and ban Tesla’s self-driving software until Tesla has fixed the defects we have identified. 

The Dawn Project has been extensively testing Tesla Full Self-Driving since June 2022. The group was founded in 2021 by Dan O’Dowd, a software security expert who develops safety-critical systems for NASA, Boeing and the US military.