If you were designing a Full Self-Driving car, would you schedule recognizing and obeying ‘Do Not Enter’ signs before or after sending it out to 160,000 Beta customers? What about ‘Road Closed’ signs? What about stopped school buses with their stop signs extended and their lights flashing? What about strollers?
I have repeatedly seen Elon Musk’s Full Self-Driving software completely ignore ‘Do Not Enter’ signs and ‘Road Closed’ signs. It will gladly self-drive right between two ‘Do Not Enter’ signs and two ‘Road Closed’ signs. It will repeatedly overtake a stopped school bus with its stop sign arm extended and its lights flashing, while children are getting off the bus. It will speed in school zones, and it will run over child mannequins in strollers.
I have seen no evidence that Elon Musk’s Full Self-Driving software even knows what a ‘Do Not Enter’ or ‘Road Closed’ sign is. They don’t register on the internal display. It doesn’t even slow down for them.
In eight years of development, it seems that no one in Tesla engineering management ever thought of scheduling some engineers to teach Full Self-Driving how to recognize and obey ‘Do Not Enter’ and ‘Road Closed’ signs! This requires a really special level of incompetence in engineering management.
What about the engineers implementing Full Self-Driving? Did every one of them go 8 years without it occurring to them even once that they hadn’t gotten around to recognizing and obeying ‘Do Not Enter’ and ‘Road Closed’ signs? I see ‘Do Not Enter’ and ‘Road Closed’ signs every day. ‘Do Not Enter’ signs are very important. They keep you from making disastrous mistakes, like driving the wrong way down a one-way street or entering a construction area with an open trench. ‘Road Closed’ signs prevent you from trying to cross a bridge that is down or drive into road construction equipment or obstructions in the road.
In eight years, did it never occur to any Tesla engineer that Full Self-Driving needed to recognize and obey ‘Do Not Enter’ and ‘Road Closed’ signs? Or did they think it wasn’t important to avoid driving the wrong way down a one-way street or driving through a construction zone with an open trench?
Does Tesla Engineering think marketing gimmicks like “auto-park” and “Optimus” are more important than safety critical ‘Do Not Enter’ and ‘Road Closed’ signs?
There is also exceptional incompetence in Tesla’s Full Self-Driving engineering testing department. How could they not have tested how Full Self-Driving cars react to ‘Do Not Enter’ and ‘Road Closed’ signs? In eight years, it never occurred to any of them even once to test ‘Do Not Enter’ and ‘Road Closed’ signs? Our testers found this out in a matter of days! And they were shocked that any person claiming to be a test engineer for a Full Self-Driving car would not think of this.
Did Elon Musk, ‘TechnoKing’ of Tesla, who personally leads the Full Self-Driving team, never think of ‘Do Not Enter’ and ‘Road Closed’ signs? Or does he think that having Tesla engineers rewrite the Twitter algorithm is more important?
Tesla’s Full Self-Driving engineering team is incompetent, which is little surprise given that it is led by the self-proclaimed “TechnoKing” of Tesla, Elon Musk.
Is this an isolated incident? Is it some bizarre coincidence that the management, the programmers, the testers, and Elon Musk would all independently make the same mistake? No, it’s par for the course for the Full Self-Driving engineering team. It will shock you as we release more and more videos demonstrating how many other completely obvious safety critical things that our testers have found that Tesla has not even started working on!
Tesla warns that its Full Self-Driving car requires a fully attentive driver.
Our tests show that the driver has less than one second to override one of Full Self-Driving’s poor driving decisions every 8 minutes, less than a second to override a safety-critical error every 30 minutes, and less than a second to stop it from killing someone every few hours.
Who in their right mind would deliver such a defective product to 160,000 customers?
Tesla warns that Full Self-Driving “may do the wrong thing at the worst time.” For instance, it may cross the double yellow line (the wrong thing) while there is an oncoming car just 60 feet away (the worst time). That happened to me, and I have it all on video.
What sort of sociopath would deliver 160,000 two-ton robot killing machines that “might do the wrong thing at the worst time” to the general public to drive on our public roads?
Elon Musk’s Full Self-Driving seems to have the driving skills of a suicidal drunk thirteen-year-old: Drunk because it makes so many irrational decisions, suicidal because it keeps trying to crash into oncoming traffic.
Every time I release a video showing a Tesla doing something crazy, the Tesla fanboy apologists scream: ‘Of course it has bugs, IT’S BETA!’ They seem to think that slapping a “Beta” label on a Full Self-Driving car excuses its making terrible mistakes on our public roads.
Elon Musk claims to have discovered a new, far better engineering process that puts Tesla way ahead of everyone else in innovation. He says traditional engineering processes are bureaucratic and prevent success. But Tesla’s engineering process is objectively terrible. We ran a survey, and 89% of registered voters agreed that ‘Do Not Enter’ signs must be recognized and obeyed before a Beta release. Similarly, 93% of registered voters agreed that a Full Self-Driving car must ensure that it won’t run over small children in a crosswalk before it is released as a Beta.
It seems that almost everyone who doesn’t own Tesla stock or stock options believes that the current version of Full Self-Driving has not yet reached sufficient maturity for a Beta release. What can we conclude from that?
Elon Musk’s Full Self-Driving software IS NOT BETA!
It is an engineering prototype. These cars should not be on our roads until all of the horrendous bugs that we and the 160,000 other Beta drivers have found and reported have been fixed.
The glacial pace of Tesla’s Full Self-Driving engineering development has allowed many of its competitors to surpass Tesla in self-driving. While Tesla’s Full Self-Driving still requires a fully attentive driver, many of Tesla’s competitors are already on the road, driving in major cities across the world without a human supervising them.
Tesla’s competitors successfully drive tens of thousands of miles without making a serious mistake, a thousand times better than Tesla’s Full Self-Driving nightmare.
Many of the “legacy” auto companies that Elon Musk loves to ridicule as dinosaurs, including GM (Cruise), Toyota (Pony.ai), and Hyundai (Motional), have already achieved full self-driving, while Tesla’s Full Self-Driving engineering is the butt of everyone’s jokes as it falls farther and farther behind each year. Elon Musk is like a kid on a tricycle trying to catch up to the peloton in the Tour de France.
Every year, for the last eight years, Elon Musk has promised that within a year or so Tesla cars will be fully self-driving. But every year he reneges on his promise and makes it again for the next year. Meanwhile, by developing software many times faster than Tesla, many of Tesla’s competitors now do every day what Elon Musk is promising he can do next year: full self-driving. And then he will promise it for the year after that.
Whose fault is this?
Maybe it is not the individual Tesla engineers’ fault that Full Self-Driving’s engineering sucks. There is plenty of evidence that many of the worst engineering decisions affecting Full Self-Driving came straight from Elon Musk, ‘Techn00b’ of Tesla.
Most telling of all is that The Dawn Project publicly reported months ago that our test engineers had shown that Full Self-Driving would run over a child in a crosswalk. We surveyed 1000 registered voters and 93% of them agreed that a Full Self-Driving car that would run over a child in a crosswalk should be banned from our roads immediately. Recently when we tested the latest version of Full Self-Driving (10.69.2.2) we found that it will still run down a child in a crosswalk. Three months have gone by and Tesla engineering still hasn’t fixed a problem that nearly everyone agrees should ban it from our public roads.
It looks more and more as if the problem isn’t that they don’t know about the ridiculously dangerous problems in Full Self-Driving, but that they just don’t give a damn.
No wonder Full Self-Driving is the worst commercial software I have ever seen. Tesla’s Full Self-Driving engineering department should be terminated for incompetence above and beyond the call of duty and the Full Self-Driving product should be consigned to the dustbin of history, where it belongs.
The Full Self-Driving software that billions of lives will eventually depend on every day must be the best software ever written, not the worst!
As Elon Musk has come to realize that his Full Self-Driving software has fallen way behind the competition and the public is starting to realize this as well, he has had to come up with increasingly bizarre rationalizations to defend his continuing claims that his Full Self-Driving software is still the technology leader.
Elon Musk says that his competitors’ self-driving achievements are irrelevant because they are “geo-fenced.” That is, they can only drive in a few cities, not everywhere, like Tesla. This is just misdirection. To be clear: many of Tesla’s competitors can drive with no human supervision or intervention in many cities, while Tesla can drive driverless nowhere! These other companies’ cars can also drive anywhere, but they require a human backup driver in those places, just like Tesla. The list of cities that allow driverless cars is growing rapidly and will soon cover nearly all urban areas in North America and China, while Tesla will still require a driver for its Full Self-Driving cars!
HIGH DEFINITION MAPS
Elon Musk claims that his system, which does not work after eight years of development, is far ahead of everyone else because when it finally works, it will work anywhere, even where there is no map. I can’t remember ever wanting to drive somewhere that there wasn’t a map. And if I did want to do some off-road driving, I would want to do it myself.
Elon Musk claims that his competitors’ systems won’t scale because they depend on high definition maps of every place you may want to go. Most of the errors I see Full Self-Driving make would be avoided if it had a high-definition map. But Elon Musk, Techn00b of Tesla, claims it is impossible to scale up the high definition maps of a few cities to high definition maps for the whole world. Wait a minute. One of the companies that is way ahead of Tesla is Waymo, which is owned by Google, which has already published high definition Street View maps of all the world’s public roads many times over. In addition, there are mapping companies that will sell all the mapping data anyone needs for a self-driving car. Excluding high definition mapping from Full Self-Driving is another terrible engineering decision by the much-vaunted Tesla Full Self-Driving engineering team.
Another reason Elon Musk says high definition maps won’t work is that they are out of date the minute they are published: every road reconstruction obsoletes the maps. This is ridiculous. High definition maps are continuously updated by the self-driving cars themselves, which compare the map to what they see. If there is a discrepancy, or any change in the street, including routine road maintenance, the onboard cameras record it and send it to the mapping company, which immediately corrects the map and pushes the change to all cars in the area. The map is fixed as soon as the first self-driving car passes the change in the road.
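The update loop described above can be sketched in a few lines. This is a toy model with made-up names (`MapService`, `tile_id`, and so on), not any real mapping company’s API, but it shows the feedback cycle: a car reports a discrepancy, the service patches its map, and every subscribed car in the area receives the patch.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    tile_id: str       # map tile the car is currently driving through
    feature_hash: str  # fingerprint of the lanes, signs, and barriers its cameras see

class Car:
    def __init__(self):
        self.local_map = {}
    def apply_patch(self, tile_id, feature_hash):
        self.local_map[tile_id] = feature_hash

class MapService:
    """Toy model of the map-update loop: cars report discrepancies,
    the mapping company patches the tile, subscribers get the patch."""
    def __init__(self, tiles):
        self.tiles = dict(tiles)  # tile_id -> feature fingerprint
        self.subscribers = []     # cars receiving live map updates

    def report(self, obs: Observation):
        # The car's cameras disagree with the stored tile: correct the
        # map and push the corrected tile to all cars in the area.
        if self.tiles.get(obs.tile_id) != obs.feature_hash:
            self.tiles[obs.tile_id] = obs.feature_hash
            for car in self.subscribers:
                car.apply_patch(obs.tile_id, obs.feature_hash)

service = MapService({"elm-st-100": "two-lane"})
fleet = [Car(), Car()]
service.subscribers.extend(fleet)

# Road maintenance changes the street; the first car to drive past reports it,
# and the whole fleet's maps are corrected.
service.report(Observation("elm-st-100", "two-lane+construction"))
```

The point of the sketch is that staleness is self-healing: the map is only wrong until the next equipped car drives by.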
Beyond general incompetence in implementing Full Self-Driving, there have been some really bad hardware engineering decisions as well. Elon Musk has argued vehemently against using LIDAR in Tesla cars. Every other autonomous car company that has achieved driverless status uses and swears by LIDAR as essential to its success. Elon Musk refuses to use LIDAR, and he has failed for eight years to deliver a driverless car. Coincidence? Incompetence?
For years Tesla cars were equipped with a front-facing radar to detect objects ahead and slow or stop the car if they were too close. Not long ago, Elon Musk announced that he would not be building radar into any more cars. Now our tests show that Tesla Full Self-Driving cars can’t detect children in a crosswalk and will run them down. Coincidence? Incompetence? Negligence? They just don’t give a damn about anything but pumping up the stock price.
More recently, Elon Musk announced that going forward, Tesla cars will no longer be equipped with ultrasonic sensors around the car. As a result, a number of features like autopark and Smart Summon will no longer be available, until they can be replaced with new versions that only use cameras (if ever).
Elon Musk keeps removing sensors from Tesla cars making Full Self-Driving’s job even harder when it still can’t even come close to living up to its name.
When you come to an intersection where you have a stop sign but the cross traffic has no traffic control, you have to merge into a fast-moving stream of cars from a standing start. For a left turn you need a good look a fair way down the road in both directions to find a slot big enough to turn into without disrupting the cross traffic. Often when you try to look left or right there is some obstruction: bushes, trees, garbage cans, or cars parked on the street. If you feel that you can’t see far enough to safely assess when to advance into the intersection, you move the car forward for a better view. If you still can’t see far enough, you lean forward until you are crushed up against the steering wheel, then you crane your neck forward and look left and right. Almost always you can finally get enough of a view to determine when it is safe to turn left. This is because the people who design and maintain intersections know how far people can (or will) move to see cross traffic and how far forward they can bend and crane their necks. If drivers can’t see adequately, they will complain to the road department to fix the problem. If the road department doesn’t fix it, there will be more and more accidents at that intersection until litigation, an influential person, or the community in general demands it.
The cameras that Full Self-Driving uses to see the cross traffic are mounted on the B-pillar between the front and rear doors on each side. When the driver is sitting back in the driver’s seat, the camera is about 8 inches farther toward the rear of the car than the driver’s eyes. So if the driver, from the driver’s vantage point, feels it is unsafe to proceed into traffic, then FSD, whose cameras sit 8 inches farther back and can see even less, must conclude that it is never safe to proceed. FSD is stuck: it can’t safely navigate this intersection and make the left (or right) turn. But a human can lean about 30 inches forward of the cameras on the B-pillar to get a better view and determine when it is safe to proceed. FSD can’t safely navigate any intersection where you feel you must lean forward to get a good enough view to be safe!
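The sight-line argument above is just similar triangles: the farther back the viewpoint sits relative to an obstruction’s corner, the shorter the view down the cross street. A minimal sketch, using made-up intersection dimensions (not measurements of any real car or street; 0.2 m ≈ 8 inches, 0.75 m ≈ 30 inches):

```python
def visible_distance(setback_to_lane_m, setback_to_corner_m, corner_offset_m):
    """How far down the cross street a viewpoint can see past an
    obstruction, by similar triangles.

    setback_to_lane_m:   distance from the viewpoint to the cross-traffic lane
    setback_to_corner_m: distance from the viewpoint to the obstruction's corner
    corner_offset_m:     how far the corner sits to the side of the viewpoint
    """
    return corner_offset_m * setback_to_lane_m / setback_to_corner_m

# Hypothetical intersection: lane 6 m ahead of the seated driver's eyes,
# parked car's corner 3 m ahead and 2 m to the side.
lane, corner, offset = 6.0, 3.0, 2.0

camera  = visible_distance(lane + 0.2,  corner + 0.2,  offset)  # B-pillar camera, 8 in back
seated  = visible_distance(lane,        corner,        offset)  # driver, seated
leaning = visible_distance(lane - 0.55, corner - 0.55, offset)  # leaning 30 in ahead of camera

# Every inch forward buys sight distance; every inch back loses it.
assert camera < seated < leaning
```

With these made-up numbers the camera sees about 3.9 m down the lane, the seated driver 4.0 m, and the leaning driver about 4.4 m, and the gap widens sharply as the viewpoint gets close to the obstruction’s corner.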
This is a hardware engineering problem. It should have been obvious to Tesla engineering for years. But Elon Musk promised repeatedly that any Tesla bought after 2016 has the hardware it needs for Full Self-Driving. Well, it does not, and there is not much that can be done about it for existing cars. You would need to move the old cameras much farther forward or install new ones. That would require drilling holes in the car and redoing the wiring. What a mess. That is probably not feasible.
In an interview Elon Musk even bragged about the brilliant positioning of the B-pillar cameras. So, maybe the engineers at Tesla know that this problem will prevent any existing Tesla cars from ever being full self-driving, but they are afraid to tell Elon because they know that he thought the camera placement was excellent. Piss poor engineering all around.
Elon Musk has repeatedly said, and it has been echoed by others endlessly, that the fact that humans can drive a car using just their eyes, proves that LIDAR, radar, and ultrasonic sensors are not required to drive a car. So, why should a Self-Driving car with 8 eyes (cameras) need LIDAR, radar, or ultrasonic sensors?
When I was informed that the 8 cameras built into every Tesla since 2016 have no stereoscopic overlap, I didn’t believe it. No one could be that stupid. Sure, humans can drive a car without LIDAR, radar, and ultrasonic sensors, but we have stereoscopic vision, which gives us the distance to everything in front of us (which is what LIDAR, radar, and ultrasonic sensors do). Without LIDAR, radar, ultrasonic sensors, or stereoscopic vision, Full Self-Driving has no direct way of determining the distance to things in front of it.
With 8 cameras they only had to dedicate two cameras to the most important direction (ahead) to get stereoscopic vision. What devotee of camera only self-driving would not think of this? Oh yes, the so-called TechnoKing!
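The distance measurement that stereoscopic overlap provides follows from the standard rectified-stereo relation Z = f·B/d: two cameras a baseline B apart see the same object shifted by a disparity of d pixels. A minimal sketch, using a made-up focal length and baseline (not Tesla’s actual camera specifications):

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance to an object from the pixel disparity between two
    horizontally separated, rectified cameras (Z = f * B / d)."""
    if disparity_px <= 0:
        # Zero disparity means the object is at infinity (or the views don't overlap).
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 1000 px focal length, cameras 0.3 m apart.
# An object shifted 15 px between the two images is 20 m away.
print(depth_from_disparity(1000.0, 0.3, 15.0))  # → 20.0
```

Note the formula’s dependence on disparity in the denominator: distant objects produce tiny disparities, which is why baseline and focal length matter, and why a single camera, with zero baseline, gets no direct range measurement at all.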
Elon Musk’s real objection to his competitors’ LIDAR-based solutions seems to be cost. Apparently, each of those Waymo cars is a custom vehicle costing a couple hundred thousand dollars, and he says that cost is simply too high for a consumer car. But those are low-volume prototype cars. And when Elon Musk formed this opinion LIDAR was very expensive, but in the last few years its cost has come down to where it is practical to incorporate into consumer vehicles.
One of Elon Musk’s typically bizarre claims is that his Full Self-Driving software is far ahead of its competitors because it is driving 160,000 consumer cars, while his competitors have only a few thousand cars driven only by their professional test drivers. Elon is citing his own recklessness and his competitors’ responsible behavior to prove his product’s superiority! Only an enthralled fanboy could fall for that hook, line, and sinker.