Anyone who doesn’t own Tesla stock can see that Full Self-Driving, even with a human driver correcting its worst errors, performs worse than a drunk driver. In fact, drivers have reported thousands of critical safety defects in Tesla’s Full Self-Driving software.

Despite this, Tesla fanboys continue to claim that Full Self-Driving, supervised by an alert human driver, is safer than a human driver alone. This claim is seemingly corroborated by Tesla’s own safety statistics.

Tesla’s fans argue that as a driver assistance system, the combination of Full Self-Driving and an alert human driver surpasses the safety of a human driver alone. However, Tesla Full Self-Driving does not function as a driver assistance system; rather, it takes control away from the human driver. Because Full Self-Driving does not assist the driver in any way, it is not an Advanced Driver Assistance System. Musk’s own words support this: he recently responded to Andrej Karpathy, the former head of Autopilot Vision at Tesla, that most people on earth are not aware that “Tesla FSD” vehicles “can drive themselves”:

Tesla’s expansion of its Full Self-Driving project relies on “statistics” claiming that FSD is safer than the average driver, statistics that fanboys invoke to deflect valid criticism directed at FSD.

However, these safety statistics were fabricated in the same propaganda factory that:

  • Ordered engineers to create the fake FSD demonstration video.

In October 2016, Elon Musk ordered his self-driving team to develop a slick demo video of a self-driving car driving a typical employee commute to Tesla headquarters. In an email, Musk said, “Just want to be absolutely clear that everyone’s top priority is achieving an amazing Autopilot demo drive. Since this is a demo, it is fine to hardcode some of it, since we will backfill with production code later.” After viewing a version of the video, Musk said that there were still too many jump cuts and that the demo footage “needs to feel like one continuous take.”

Musk told his team, “I will be telling the world that this is what the car *will* be able to do, not that it can do this upon receipt.”

During a deposition in June 2022, Ashok Elluswamy, the current director of Autopilot software, was asked whether the video accurately reflected the capabilities of Autopilot at the time of its release.

He responded, “The intent of the video was not to accurately portray what was available for customers in 2016, it was to portray what was possible.”

Elluswamy also admitted that while Tesla was filming the video, Autopilot crashed “into a fence in our parking lot.”

But Elon Musk had other plans for the video. According to internal emails, when the video was finally up to Elon Musk’s standards, he dictated that the following opening text be added to the video:

The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.

In October 2016, Elon tweeted that the video showed a Tesla “driving itself” with “no human input at all”, linking to the video of the journey on the Tesla website, which has since been delisted:

The video is not one continuous take; it only looks that way. It was assembled from short clips made during three days of shooting across more than 500 miles of driving (avoiding heavy traffic). In that time there were 182 disengagements, where either the car told the driver to take over or the driver took over to prevent an accident!

When asked if the Tesla drove up over a curb, through bushes and hit a fence, Elluswamy testified: “I’m not sure about the curb or the bush, I do know about the fence.”  The film of that ended up on the cutting room floor.

Tesla and Musk did not disclose when releasing the video that engineers had created a three-dimensional map of the route the Model X took, Elluswamy said during his deposition. Years after the demo, Musk said that the company doesn’t rely on high-definition maps for automated driving systems, arguing that systems that do are less able to adapt to their surroundings. Yet this drive depended on mapping technology that has never shipped in any Tesla production car.

One YouTuber, former Tesla employee AI Addict, recently recreated the Paint It Black video to determine whether FSD Beta could handle the same route over six years later. The recreation showed that Tesla’s software still couldn’t handle the basic route it was purported to have driven all those years ago, blowing past a yield sign after just two minutes and causing the driver to exclaim, “Oh God, f*ck!”

From its outset until today, FSD has rested firmly on a foundation of lies and propaganda, despite Elon’s public claims at the time that the video was an actual reflection of FSD’s capabilities.

  • Ordered employees to lie to customers about range.

Last month, a report revealed that Tesla employees were under instructions to thwart any customers who complained that their vehicles could not travel the full distance indicated by the range estimate displayed in the vehicle.

It was also reported that Tesla had established a “Diversion Team” to deny service appointments for customers complaining about the range of their vehicles. At Tesla’s Nevada base, the report uncovered employees celebrating each cancelled range-complaint appointment by striking a xylophone, provoking applause from other Tesla workers, who were reportedly ranked internally by how many service appointments they cancelled.

  • Ordered customer support to not record customer complaints about unintended acceleration and phantom braking.

Another recent landmark investigation revealed that Tesla’s track record of impeding legitimate complaints about its products extends beyond exaggerated driving range to cases of unintended acceleration and phantom braking.

Instances of unintended acceleration have been widely reported, and the Office of Defects Investigation (ODI) recently received a petition asking it to re-evaluate its decision not to open a defect investigation into unintended acceleration in Tesla vehicles. The petition referred to open-source analysis of the Tesla Model 3 inverter design, which showed that “negative spikes in Tesla’s low-voltage system can be interpreted as a full acceleration command even though the driver did not touch the accelerator”. In February 2022, the National Highway Traffic Safety Administration opened an investigation into “phantom braking” after a spike in complaints.

Files retrieved as part of the ‘Tesla Files’ by German publication Handelsblatt documented that Tesla received more than 2,400 complaints alleging instances of unintended acceleration in its vehicles. Evidence from internal sources also revealed that Tesla customer service employees were instructed to handle such complaints verbally, without committing details of consumers’ reports of cases of unintended acceleration and phantom braking to writing. Information obtained by Handelsblatt showed that, when dealing with such complaints, employees were instructed: “Do not copy the report below into an email, text message or leave it in a voicemail to the customer”.

  • Promised that a Model 3 bought in 2019 would not only not depreciate but would increase in value to a couple hundred thousand dollars.

In April 2019, Elon Musk publicly declared on Twitter that, due to the increasing value of its Full Self-Driving package, Tesla’s vehicles would appreciate over time.

He later argued that, in the future, vehicles purchased in 2019 would be “worth $100k to $200k”, with the Model 3 valued at approximately “$75k”.

Since Musk’s statements, the price of Tesla’s Full Self-Driving package has increased to $15,000, while the Model 3 has been subject to a series of price cuts and currently starts from $40,240.

  • Said in 2016 that a Model S and Model X can drive autonomously safer than a human.

In 2016, Elon Musk told an audience that a Tesla Model S and Model X could “at this point, drive autonomously with greater safety than a person”. Seven years later, Tesla still warns that its self-driving software “may do the wrong thing at the worst time”, a warning borne out by at least 23 fatalities and 840 accidents attributed to its ADAS technology in NHTSA’s SGO data.

  • Promised that a Model 3 would operate for a million miles with minimal maintenance at an all-in cost of one-third the per-mile cost of a comparable gasoline-powered car.

In April 2019, Musk told investors that Tesla was developing a new battery pack that would be able to last one million miles, and that “the cars currently being built are all designed for a million miles of operation. The drive unit is designed, tested, and validated for 1 million miles of operation.”

  • Stated that Tesla would pay every Model 3 owner $30,000 per year of passive income if they let Tesla use their Model 3 as a robotaxi when they weren’t using it.

Elon has even claimed that Tesla’s robotaxi fleet would generate income for Tesla owners who rented out their vehicles, on the same model as Uber and Airbnb, leasing them via a ride-hailing app. During an announcement at Autonomy Day in April 2019, he claimed this would generate up to $30,000 a year for Tesla owners.

  • Repeatedly overstates the progress of the development of products that remain vaporware to this day.

A cursory Google search of “Musk unveils” and “Tesla reveals” returns a string of headlines over the years announcing various supposed achievements. The problem with these and other such “unveilings” is that the products remain vaporware. And while the articles themselves acknowledge this, collectively they perpetuate the narrative of Elon as a manufacturing visionary. Customers anticipated the Cybertruck, the Semi, and the Roadster for years; each either remains vaporware or has failed to meet Musk’s promises for roll-out and capabilities.

In July 2023, Tesla announced that the first Cybertruck had rolled out of its Texas factory, to the satisfaction of investors and consumers. First announced in 2019, the Cybertruck had kept customers waiting four years for its launch.

Another prime example of Elon misleading investors and the public is his claim from 2016 that he was launching the Boring Company, to “build a tunnel” to avoid traffic, in what became another of Elon Musk’s failed enterprises.

“It shall be called The Boring Company,” he added. “Boring, it’s what we do.”

In 2017, he tweeted that the plan was to “start digging in a month or so” as part of a tunnel system in Los Angeles, though at the same time it was revealed that he had failed to obtain permission from city planning officials to do so.

  • Has made numerous baseless claims about landing on Mars

Elon’s baseless claims about settling on Mars have a long history. He has been discussing his plans since 2004, according to a contemporary article in The Guardian. In 2011 he claimed he would put “a man on Mars in 10 years“, which a year later he revised to a “12 to 15” year timeframe.

Like the Cybertruck, Semi and Roadster, Musk’s timeline has continued to shift over time, with Musk saying almost a decade later in November 2020, that it would take “10 or 20 years” to settle on Mars, a twenty year discrepancy from his first estimate in 2011. This shifting estimation underwent a considerable volte-face just a month later, when he told an audience in Berlin that he was “highly confident” that SpaceX would be able to land humans on Mars in “about six years from now“.

  • For each of the last eight years stated that Tesla would achieve full autonomous driving within a year.

In fact, Elon has been promising the delivery of robotaxis “next year” for the past eight years. They have still not been delivered and are not even close to being ready:

Elon’s lies and manipulations of the truth about Full Self-Driving are well-documented. In 2019 he claimed that Tesla’s self-driving software would be able to “drive around a parking lot, find an empty spot” as well as “read signs“. He even claimed in 2016 that within two years a Tesla would be able to drive from Los Angeles to New York without a driver:

At the company’s Autonomy Day in 2019, Musk claimed that by 2020, Tesla would have 1 million robotaxis on the roads. Investors, reporters and enthusiasts alike were enthralled by the idea that a Tesla would soon deploy a fleet of driverless robotaxis to revolutionize transit. Musk boasted during the event that within two years, Tesla would be making vehicles with no pedals or steering wheels.

Casting aside the mammoth task of achieving level 5 autonomy, Musk quipped that “all you need to do is improve the software”, stating that Tesla’s hardware was already good enough to serve as the base on which Tesla would build a fully autonomous car. This April will mark four years since Elon promised Tesla’s investors level 5 autonomy. As of today, Tesla has achieved none of these lofty goals: there are zero Tesla robotaxis, and Full Self-Driving is still riddled with errors, failing in the most basic driving situations.

However, despite Elon’s public pronouncements that autonomy is just around the corner, Tesla assured the California Department of Motor Vehicles in December 2020 that none of Tesla’s “self-driving” software will ever drive a Tesla car without a human supervising it at all times.

So, Full Self-Driving will never be self-driving!  It will never live up to its name, despite Musk’s endless promises. It will always remain a fraud on Tesla’s customers, investors, and regulators.