SANTA BARBARA, CA – Public safety advocacy group The Dawn Project recently conducted live-streamed and recorded safety tests in collaboration with major Tesla investor Ross Gerber, revealing alarming safety defects in Tesla’s Full Self-Driving (FSD) software. The latest findings, streamed live, showed Tesla FSD driving through a stop sign and nearly causing a collision during a real-world test in Santa Barbara, CA.
In response to Dan O’Dowd’s harsh criticisms of the dangers of Tesla Full Self-Driving, Ross Gerber challenged Dan to a ride-along in Santa Barbara in his Full Self-Driving Tesla, with Ross supervising. Ross offered to demonstrate the capabilities and safety features of Tesla’s Full Self-Driving Beta software, in order to prove that The Dawn Project’s findings of critical safety defects in the software were erroneous, and that Tesla’s Full Self-Driving software is safe to be sold to the general public.
Ahead of the event, hundreds of Tesla supporters pressured Ross not to proceed with the live-streamed Challenge, because they know that Tesla Full Self-Driving would likely try to kill somebody during a one-hour leisurely drive. This is what they and Tesla are concealing from the public.
Prior to the tests, The Dawn Project’s founder, Dan O’Dowd, and Ross Gerber took part in a point-by-point debate laying out their respective views on the practical, technological and regulatory precautions necessary to successfully develop autonomous driving systems. Their debate was live streamed on YouTube, and a full HD recording will be made available shortly.
Their debate continued during a test drive with Ross supervising his Full Self-Driving Tesla on a route he selected to demonstrate the capabilities and safety features of Tesla FSD’s latest version (11.4.4). The participants experienced the technology in real-world scenarios while debating the performance of the software and its safety features.
During the FSD Challenge, the software blew past a stop sign at 35 mph, forcing Ross Gerber to slam on the brakes and narrowly avoid a collision with an oncoming vehicle. A clip of the incident can be found here and below:
In his assessment of Tesla Full Self-Driving’s performance during the ride along, Dan O’Dowd commented: “The results of our leisurely drive corroborate The Dawn Project’s previous findings, which show that Tesla’s Full Self-Driving software sometimes does not register basic safety critical road signs. For months The Dawn Project’s safety tests have been disparaged by Tesla and its supporters in a concerted effort to discredit our campaign. We have highlighted these dangers for nearly a year, and rather than fixing them, Tesla has claimed that our tests are fake. My acceptance of this challenge was to demonstrate that our tests are genuine and my safety concerns are real. Were it not for Ross Gerber’s quick reaction in slamming on the brakes, FSD would have put us all in the hospital, turning our one hour leisurely drive into a tragedy. A product that tries to kill someone every hour should never be sold to consumers, let alone be allowed onto our public roads.”
To watch a recording of the live stream, please see here.
For a full compendium of The Dawn Project’s previous testing of FSD’s 11.3.6 and 11.4.4 versions conducted in June 2023, please see here.
An extended clip of Tesla’s Full Self-Driving software blowing past a stop sign at 35 mph during Dan and Ross’ ride-along is available here.
The Dawn Project continues to test the latest version of Tesla Full Self-Driving and will publish the results of further safety tests over the coming weeks.