Tesla’s Full Self-Driving Beta software was recently put to the test in a widely viewed video by YouTuber and Full Self-Driving Beta tester CYBRLFT.
The test put Full Self-Driving (FSD) through its paces on public roads across a range of driving scenarios, running the latest version available at the time of filming, FSD Beta 10.69.2.
The full video was published by CYBRLFT on 14 September 2022 and can be found here.
The test was conducted over three runs of ‘The Gauntlet’, a course designed to test FSD’s capabilities across a range of city street driving scenarios. During each run, the driver intervened only to avoid collisions with other road users, allowing the Tesla to make errors that would not result in a serious accident.
During the third run, which spans 20:13 to 27:14 of the video, the Tesla committed a series of critical driving errors, summarised below:
| Time | Type of incident | Description | Quote from driver |
| --- | --- | --- | --- |
| 20:31 | Malfunction: no intervention | Swerves towards parked vehicle | “What are you doing? Wow, how drastically it wanted to avoid that truck. That was so dramatic.” |
| 21:06 | Malfunction: no intervention | Swerves towards parked vehicle to avoid oncoming vehicle | “See, we are really overreacting with oncoming vehicles… That I don’t like. That’s unnecessary.” |
| 21:28 | Disengagement | Attempts to manoeuvre around stationary vehicle waiting at intersection and move into oncoming traffic | “Don’t you start thinking, this is an obstacle… Look at this. I’m going to have to take over, because you’re just going to try to go around… See that? That’s just sad.” |
| 22:02 | Malfunction: no intervention | Fails to commit to left-hand turn, confusing other drivers | “All right… Let’s see. Come on… Come on… Let’s see… Let’s see… Let’s see… Get in there. Get in there and commit to it. Commit to it… Commit to it, damn it. Holy crap… This blue car behind me probably thinks I’m drunk.” |
| 22:08 | Malfunction: no intervention | Enters on-ramp too quickly | “Way too fast. What are you doing? Oh, man. We have lost any sort of finesse around that corner.” |
| 24:46 | Disengagement | Passes a 40 mph speed limit sign and accelerates to 51 mph (82 km/h) | “Like, oh my goodness, right now we had a speed limit issue, which that one, while I’m thinking about, I’m going to go ahead and report that… It was still detecting highway speeds… That’s a first. I’ve not seen that. That’s not good.” |
In 7 minutes of driving, the driver was forced to disengage on two occasions, equating to one disengagement every 3.5 minutes (or 210 seconds).
The first disengagement occurred when the Tesla attempted to manoeuvre around a stationary vehicle in the turn lane and swerve into oncoming traffic. The second occurred when the Tesla failed to read a 40 mph speed limit sign and instead continued to follow the highway speed limit, accelerating to 51 mph (82 km/h).
Tesla Full Self-Driving also committed four driving errors that did not force a disengagement but would have confused other road users and potentially endangered those around it. Four such errors in seven minutes equates to one serious driving error every 1.8 minutes during this test.
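For readers who want to verify the arithmetic, here is a minimal Python sketch. The timestamps and incident types are transcribed from the table above, and the seven-minute run length is rounded from the 20:13 to 27:14 segment:

```python
# Sanity check of the rates quoted above, assuming a 7-minute run
# (20:13 to 27:14 is 7 min 1 s; rounded to 7 min for simplicity).
RUN_MINUTES = 7.0

# Incident list transcribed from the table above.
incidents = [
    ("20:31", "malfunction"),
    ("21:06", "malfunction"),
    ("21:28", "disengagement"),
    ("22:02", "malfunction"),
    ("22:08", "malfunction"),
    ("24:46", "disengagement"),
]

disengagements = sum(1 for _, kind in incidents if kind == "disengagement")
malfunctions = sum(1 for _, kind in incidents if kind == "malfunction")

print(f"Disengagements: {disengagements} -> one every "
      f"{RUN_MINUTES / disengagements:.1f} min "
      f"({RUN_MINUTES * 60 / disengagements:.0f} s)")
print(f"Other serious errors: {malfunctions} -> one every "
      f"{RUN_MINUTES / malfunctions:.2f} min")
```

Running this reproduces the figures quoted: two disengagements (one every 3.5 minutes, or 210 seconds) and four other serious errors (one every 1.75 minutes, rounded to 1.8 above).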
Who would purchase a product that malfunctions every two minutes and puts the safety of its driver and other road users at serious risk every 3.5 minutes? Why should we put up with it in a Full Self-Driving vehicle?
With potentially millions of lives on the line, Full Self-Driving software should be the best commercial software ever sold – not the worst!