Dan O’Dowd, founder of The Dawn Project, was interviewed by veteran technology journalist John Koetsier for the TechFirst podcast, where he discussed The Dawn Project’s mission to make computers safe for humanity and the group’s latest safety tests of Tesla Full Self-Driving.
Dan and John covered a wide range of events and themes relating to The Dawn Project’s campaign, including Elon Musk’s interactions with Dan, Tesla threatening the group with litigation, and the current autonomous vehicle landscape more generally.
A link to Dan’s interview, together with a full transcript, can be found on TechFirst’s website, here. The full interview can also be viewed on TechFirst’s YouTube Channel, here.
A full transcript of Dan’s interview with John Koetsier can be found below:
John Koetsier: Elon Musk calls him “batshit crazy,” but Dan O’Dowd has built software for the F-35 fighter jet and the B-2 bomber, and to secure U.S. nuclear forces. He’s also built a microprocessor for Mars, worked on the Mac at Steve Jobs’ request, and completed a ton of test drives of Tesla Full Self-Driving.
His conclusion: it’s worse than a drunk driver.
And: it does not recognize small children, leading to situations where it could cheerfully run them right over. (It does appear to recognize adults.)
In this TechFirst we chat with Dan about his tests, what he thinks is wrong with Tesla, and why Tesla is falling behind General Motors and Google (Waymo) in full self-driving.
Dan O’Dowd is one of those guys who has a resume. I mean, he’s been building software since the ’70s, wrote the software for Mattel Electronic Football, designed new microprocessors, founded companies, designed compilers at Steve Jobs’ request, “rescued the Macintosh,” and other things like that.
He also says he designed and wrote the only operating system that never fails and can’t be hacked, which secures the United States nuclear forces, flies the intercontinental nuclear bombers, the B-1, B-2, B-52, the fighter jets, the F-35, F-22, all of that, and he designs a lot of unhackable software for phones and FBI computers.
Thank you so much for joining us. Dan, really looking forward to chatting. How are you?
Dan O’Dowd: Great, good to be here.
John Koetsier: Excellent. Talk about what you did with Tesla’s full self-driving. You ran a test. What happened?
Dan O’Dowd: Well, we’ve run a lot of tests to test a lot of different things.
Our first test was … there are lots of people doing YouTube videos that show Full Self-Driving. I watched a lot of those videos and was very concerned by the number and the kinds of failures that were occurring, which struck me as way beyond what we should have out on the road. And so we decided to test some things.
Well, one of the first things we decided to test is, can it see a small child on the road?
Because all children are small, and we had seen people putting things like garbage cans and barbecue grills on the road, and it didn’t see them … it did not register those things on the road … so I said, well, a small child is smaller than some of those things.
So we bought a Tesla, put Full Self-Driving on it, and went out on roads and put out the mannequins … not real children, they’re store-type mannequins … and we just sent the Tesla in Full Self-Driving mode to go forward, and it ran into them all the time.
That was pretty shocking. We tried a lot of different things. We moved them across the road, like they’re walking across the road, put them in crosswalks, put them on country roads, all sorts of different places, and, not always, but most of the time, it ran into them. It did not see them.
When you’re going more than about 25 miles an hour, it doesn’t see them. It can’t. At very low speeds it can see them because it has lots of time.
John Koetsier: I drive a Tesla Model Y. I sometimes use Autopilot. I never bought Full Self-Driving; I didn’t see the value for the price Tesla’s asking right now. I believe it’s like $12,000 or $15,000, depending on where you live and what currency you’re in.
What version of full self-driving were you using? And when were these tests? Were they recent, the past couple months?
Dan O’Dowd: Well, we started in June, I think, of last year, about a year ago. We published in August. I don’t remember the precise version number there. It was 10 dot something, but it was whatever the latest version was at the time.
We published, we made TV commercials to show people this, and basically demanded Tesla figure out what the issue is. Like, why can’t you see those children? Is the hardware incapable of doing that? Or, if the hardware is capable, what do you have to do to fix the software so it will recognize them?
It does recognize adults. An ordinary human, five feet and taller, it recognizes pretty consistently. People have said that, and that’s true.
But small children are quite a bit smaller, and they have simply fallen below some threshold at which it can reliably detect these smaller people. Or maybe it was not trained on them as well … it could simply have been trained on adults and they never tried it on children. I don’t know how they do it, I don’t know what their mechanism is, but it doesn’t work.
So we said, you have to fix it … well, it’s now ten months later, I think, and they still haven’t fixed it. We were just out there again a few days ago; every time there’s a new version, we bring it out and we run it again.
Did they fix it? They haven’t fixed it. Ten months. Do they not know how to fix it? Is it just fundamental? Or do they just choose not to fix it? I’m actually in the “choose not to fix it” category. If they’ve spent ten months and they can’t solve this problem, it’s never coming out. I mean, it just doesn’t work. And if they can fix it but they don’t, then I have a real problem.
They know the product is defective. People’s lives are at risk, and they keep selling it to people when they know it’s defective, and they don’t even try to fix it, or they don’t try hard enough. Maybe they try, but this should be the number one priority.
They should have the whole team on it: why doesn’t it do it? What do we have to do to fix it? And they’re not doing that.
John Koetsier: Elon Musk’s companies in general are not well known for being super responsive to outsiders with questions, whether they’re journalists, whether they’re analysts, whether they’re governmental bodies or anything like that.
Has Tesla responded in any way to the tests you did?
Dan O’Dowd: Elon Musk called me batshit crazy.
John Koetsier: On Twitter, I assume.
Dan O’Dowd: Yes, yes. He said my company Green Hills Software is a pile of trash. I can’t remember what else he said. He said like four or five things of that nature.
Oh, and then they sent me a cease-and-desist letter, and it said that these tests aren’t real, they’re fake. Apparently Tesla’s official position is that all of my videos are fakes and I lied about everything.
And that’s why they’re not doing anything: because they think it’s just a fake, that it wouldn’t really do that. It wouldn’t really run over a child.
I keep inviting them. I’ve invited Tesla to come here, and I will show their engineers; if I have to go to their factory, I’ll show them how to do it in their parking lot. They’re not here. They didn’t ask. They keep saying it’s just a fake. I brought more people. I brought journalists. We had a write-up in Motor Trend; they sent somebody out here to observe our tests, quite recently, two weeks ago, I think. We’ve had other YouTubers … people have been coming by our place to check it out and run our tests. Those will be released; some, I think, are going to be released today. One of the prominent Tesla YouTubers is going to publish some videos that we made together showing that, yep, it still runs over those mannequins.
John Koetsier: Wow. Now, just to back up for half a second here because Elon Musk and Tesla have called it fake, called you batshit crazy.
You are legit the person who designed software for laptops for the FBI. You’re the person who designed an operating system for nuclear forces. You are not some fly-by-night crazy in his backyard somewhere?
Dan O’Dowd: Right, I mean, that’s what I’ve been doing for a long time. You know, I’ve been running this company for 41 years now. We have been writing software for lots and lots of different companies for lots of very significant products for all those years, and that’s what we’ve been focused on.
I do safety engineering: what is and isn’t the right process, how we build airplanes that don’t fall out of the sky. It’s been a long time since there was an airliner crash in the United States. I don’t remember when the last one was, but it was many, many years ago. Airplanes, the latest airliners, have become really safe.
John Koetsier: I think the latest one was that crash in the Hudson, where that plane took off, hit a flock of seagulls, and landed in the Hudson. That’s got to be 20 years ago now, something like that. And people were evacuated on the wings. But yes. Okay.
Dan O’Dowd: Right? That one, no one died.
John Koetsier: Exactly. So you’ve designed and built software that is, I want to say, mission-critical, but that’s putting it lightly. I mean, we’re talking about software for tools and things that can kill literally millions of people if they don’t work right.
What’s the fix here? How does Tesla get it right? Assuming they start caring and listening, how do they get it right? How do you fix it?
Dan O’Dowd: Well, there is something that most people don’t seem to know. The people of San Francisco know it, but pretty much no one else: there are many companies that have driverless cars right now that work.
They’re not perfect, but my estimate is that a Tesla on Full Self-Driving will foul up in some important way within 10 miles of driving. If you drive for 10 miles, it’s likely it will do something stupid. The Waymo cars, the Cruise cars … the Chinese have a bunch of companies too. There’s AutoX and Baidu, and Toyota has a group doing this. There are about 10 companies that have real driverless cars. They go tens of thousands of miles without having a problem, without making a mistake or having an accident of some sort.
And the Teslas are 10 miles, or tens of miles.
They celebrate when somebody does … a thing. We did that thing. I don’t know, were you familiar with the controversy we had with Ross Gerber a few days ago?
John Koetsier: Yeah, I heard a little bit about it. Go into some detail.
Dan O’Dowd: So what happened is that I got upset because all the people on Twitter were saying my tests are fake. Like, this is fake, it’s not real, it wouldn’t really hit a kid, it wouldn’t really go around a school bus with all the lights flashing. Oh yes, it will.
And so I just put all these videos together, a bunch of them, and I said, look at it. You can watch. I’ll go past the bus six times; it’ll go around it, around and around and around. And citizens know what a Do Not Enter or Road Closed sign is.
So we put up Do Not Enter and Road Closed signs, and police tape across the road. It goes right through it. It doesn’t even slow down. It doesn’t see those signs.
And they said, that’s all fake, it’s all fake, it’s all fake. So I started challenging them: I want one of you guys to come here. I’m going to put you in the car, and I’m going to photograph you to make sure your feet don’t touch those pedals and stop it, right? Make sure you don’t stop it. And we’re going to watch your hands, and we’re going to watch everything. And we’re going to say, go. And no one would take me up on it. No one would come, no one would come.
Finally, Ross Gerber, who is actually an investor, not really a YouTube guy, but he does drive FSD every day, decided to come, and I said, fine. He lives in Los Angeles somewhere, and he said, well, let’s go for a drive in Santa Barbara. I said, let’s go for a drive in Santa Barbara.
He said, I’ll come to Santa Barbara, we’ll get in the car, and we’ll drive around, and you’ll see how great it is. And we drove around, and for 40, 50 minutes, you know, it worked fine. It made some mistakes, it did some dumb things, but it was working okay.
And then a big truck, like a garbage truck, was pulling out of a driveway onto a two-lane highway; it was coming around and backing up, and it had backed up, basically, into our lane. And it’s going beep, beep, the thing they do when they’re backing up. And we’re coming up the road, and Full Self-Driving decides it’s a bright idea to try to cut behind the truck that’s backing up, crossing the yellow line into the other lane, when it couldn’t fully see what was going on behind this truck.
And Ross panics: he grabs the wheel, swerves around, hits the accelerator, because the car had almost stopped behind the truck that was backing up. Okay, that’s not good. Two minutes later, we go past a stop sign at 35 miles an hour. I mean, I’m watching; he’s in the car driving, quote ‘driving’, and I’m in the passenger seat.
And the stop sign … we see the stop sign, we’re doing 35 miles an hour when we cross it, and he hits the brakes, bam, as hard as he can. The car stops two feet short of a car that was crossing the intersection, legally, that had already entered the intersection and was crossing right in front of us.
We stopped about two feet short of that car. A one-hour drive, and one time we came within two feet of a crash, an actual T-bone crash. And the other time, that truck would have mangled his car, right? It would have just backed up and smashed into it. We probably wouldn’t have been hurt, but the car would have been seriously damaged.
One hour.
John Koetsier: Wow.
Dan O’Dowd: It’s just terrible. The Waymo and Cruise cars in San Francisco are driving around every single day with no driver in them at all, picking people up and giving them Uber-like rides throughout San Francisco.
And those things go tens of thousands of miles without doing something like that. The Tesla, in our case … maybe we went 20 miles in that hour, because, you know, Santa Barbara was slow, we weren’t going fast. Maybe we went 20 miles, and we had two problems.
John Koetsier: So for anybody who hasn’t been to San Francisco, it’s a very challenging environment to drive in. There are lots of lanes, lots of traffic, lots of funky angled intersections that are not 90 degrees. It’s very busy, super congested, so it’s not an easy place to drive. And there are pedestrians who are walking whenever, wherever they can, all that stuff.
I was just in SF, I want to say a month ago, and I was going, why am I seeing these Waymo cars, like, literally seven times, eight times in two days? Of course I’m in meetings most of the time, right? So it’s just when I’m popping out that I’ve seen the Waymo cars going around. I thought, you know, are they mapping? Then I realized they’re full self-driving taxis taking people from A to B and that sort of stuff, and they’re doing it very, very effectively.
Why are we not seeing some of the other companies mass-release these technologies, if they’re that good and so much better than the Tesla?
Dan O’Dowd: They’re still not good enough. They’re not good enough that you can count on being able to go anywhere and drive anywhere. They aren’t there, and they know it. Right now they’re mapping out something like 25 cities. They’re in 25 cities with cars with a driver in them … I mean, a supervisor. You don’t really call them a driver; a supervisor, to watch over it.
And they’re mapping them out, driving every road, figuring out where every tricky turn is and where every traffic light is, so the car just knows everything and can drive in those places reliably. And they will be branching out. Waymo just announced Santa Monica in Los Angeles; they’re going to start there shortly. And Cruise just announced Austin and Dallas, I think it was. It wasn’t Houston.
They’re going to be starting there in a few days, weeks perhaps, but soon. And they’re expanding. And in China, in lots and lots of different places and big cities, there are these robotaxis; several Chinese companies are making robotaxis.
And yet, while Tesla delivers something that’s perhaps a thousand times less reliable to 400,000 consumers, Waymo says, we’re not ready, even though they’re a thousand times better. They’re correct, they aren’t ready, but they’re way, way ahead. Actually, Tesla amongst the AV guys is regarded as a total joke. They all laugh like crazy when you mention Tesla and their AI and their training methods and whatever. They just say, these guys don’t know what they’re doing.
I mean, it’s just that simple. They do not know what they’re doing. They’re doing all the wrong things. They’ve never learned the lessons. They don’t know what to do. That’s what they say in private. They never say that in public because they don’t want Elon Musk to jump all over them and disrupt their operation.
John Koetsier: And get the Tesla attack dogs on him?
Dan O’Dowd: Right. They want to stay away from that, so they let me do the dirty work and go out there and point out Tesla’s problems, rather than pointing them out themselves.
They don’t want that. They want to get it done, finished, and really right, so they really can deliver it to the world.
And it’s going to happen. It might be a few years, but it’s not going to be a lot of years, I don’t think. Those guys are really doing quite well.
John Koetsier: I have some experience with the Tesla attack dogs, because I have written articles that have not been super complimentary of Elon Musk or Tesla, in some cases a number of years ago. And the amount of feedback that you get …
Dan O’Dowd: I know.
John Koetsier:… is very significant. And some of it is also more intellectual than the feedback that Elon Musk gave you.
That is … it’s literally shocking, however, because I don’t think there’s any disputing that Tesla has the largest fleet of cars out there that is gathering data all the time. I’ve enabled almost every option in my Model Y: yes, you can see what I’m doing, what’s going on. I disabled sending information from the internal camera in the car cabin.
But, you know, you can see what I’m doing when I’m parking and all that stuff. And so I’ve enabled all that stuff. And I think that many hundreds of thousands of other Tesla owners have done the same. It’s shocking that with all that data and with all that insight, they can’t build something that is more bulletproof than a Waymo or a General Motors Cruise or something like that.
Dan O’Dowd: Right. My best understanding, from the people who know about these things, is that they’re clueless. They have all that data, and they don’t know what to do with it. They think they’re the smartest guys on the planet. They think they’ve got everybody beat, but everybody else can build a self-driving car.
It’s basically, I guess, not that hard. Ten companies have done it, and only Tesla hasn’t. And another thing: they all say, we don’t think it can be done just with cameras. We think you need LIDAR, we think you need radar. Everybody who has built a successful self-driving car says you need LIDAR and radar, and the one company that’s failed refuses to use LIDAR and refuses to use the radar in their system.
Again, there’s a balancing act here, and all the weight seems to be on the Waymo side and not the Tesla side.
They’re all completely convinced that some magical thing is going to happen with the AI. It’s like it’s going to awaken one day and suddenly become smart, and not run down children, and not go past school buses, and not do the tons of things it does wrong.
We’ve documented a dozen. We have ones of it weaving back and forth across the road. And this is a rare event, but it’s really scary; it’s happened to me twice. It will get frustrated on a left turn. That’s just how I say it; it’s how it feels. You’re on a left turn and the cars just keep coming and coming and coming, and there’s no traffic light to get you through it. They just keep coming. And at some point it just says, I’m gonna go for it, and it jumps out in front of those cars. And you’re just completely shocked, because you see the cars coming and you know we’re not gonna go, we’re still waiting for our chance.
That’s really how it seems: at some point it just says, I’m tired of waiting, I’m gonna do it. And it’s … scary when it does that, because there’s a car coming at you fast, 50 or 70, 80 feet away. It happened to me. We documented it frame by frame: the car coming our way was 70 feet in front of us, and we just turned in front of it.
It’s really frightening when that happens, because that’s not just a little problem, it’s a “you could die” problem if it did that and you ran into those guys. And they don’t have a traffic light, they don’t have any reason to slow down. They’re just zooming by.
John Koetsier: Yeah, yeah. It’s interesting, and it makes me think of what happens as the overall car fleet becomes smart. That’s a process that’ll take decades, of course, because people keep their cars for a long time in some cases, and you’ve got classic cars that will never get updated with smart technology. But at some point, we’ll have a fleet that is largely cars that can communicate with each other.
I’m here, I’m here, I’m here, those sorts of things, and hopefully avoid some of that stuff. It’s disappointing to learn that Tesla Full Self-Driving is that bad. We’ve had, it’s got to be, six or seven different Elon Musk claims over the years: we’re almost there, you’ll be able to rent out your car by the day, you’ll have a full self-driving fleet of a million cars, a million robotaxis.
And I think he’s learned his lesson to not give those predictions.
Dan O’Dowd: No, he hasn’t. He said it again just in the last two days. Somebody showed me a clip this morning of him doing it, like yesterday or the day before yesterday. He went out, he did it again.
He even said, I know I’ve been wrong about this in the past, but I think that this time we’re gonna have it by the end of the year. Full Self-Driving. Yep, he said it.
John Koetsier: Hope springs eternal.
Dan O’Dowd: Well, I’m not sure it’s hope. I think it’s desperation. It’s like he has to say that because the obvious situation is we’re not even close.
He has to say that to try to keep people from realizing that we’re not even close. He just keeps pushing that out there as his marketing message, to get people to keep buying the Full Self-Driving product. Which I find despicable: he knows, I know, that it has these terrible safety defects in it, and he’s selling it.
He’s out there pitching it, they’re actually saying it’s five times safer than a human driver.
Well, I did a little study on that one too. It’s worse than a drunk driver. We just go out and drive, and it will get to an intersection and stop in the middle of the intersection, with the steering wheel flipping back and forth, trying to decide what to do.
Does an ordinary human driver do that? No, they just drive through the intersection and go where they’re going. They don’t stop in the middle. We had it get confused on a roundabout, and it was going to drive the wrong way around it. Again, how does it decide that? It’s crazy. Oh, and it blew a stop sign, went right through a stop sign that we didn’t even know was a problem. We had never seen it before, and it just didn’t see the stop sign.
It was partially obscured by a palm frond, but we didn’t miss it, right? Human drivers see that it’s a palm frond in front of a stop sign. The car sees it, but it thinks, that doesn’t look like a stop sign to me; that’s not what I was trained on. It was trained on stop signs, not stop signs with palm fronds in front of them. That, I would guess, is the problem there.
John Koetsier: Humans are shockingly good at pattern recognition. Hey, you see a fraction of something and you know what it is, even from a distance. It is pretty amazing.
So just to clear up … I haven’t checked this, but I’m sure that since the Tesla lovers have been sicced on you, people have said, oh, you’re a short seller or something like that. Is that the case?
Dan O’Dowd: Oh, I’m a murderer. I’m a murderer. That’s what they say. No, literally. Two, three days ago, one of the more reasonable Tesla fans, Tesla Boomer Mama, held a Space with their people to talk about some stuff. I joined.
And so the fearless leader of the opposition forces, Mr. Whole Mars Blog, or Omar Quazi, came out and said, you’re a murderer, you’ve killed a million people. It’s your responsibility: you say you know how to build software, that’s really great, but you didn’t build an autopilot, and a million people died. So, like, I killed a million people because I didn’t build Autopilot?
It’s like, well, wait, no. I don’t even know how to do that, that’s not my field. And besides, it costs billions of dollars to do that; I don’t have that kind of money.
John Koetsier: Yeah.
Dan O’Dowd: But he said it in public, in front of everybody, that I killed a million people. Yeah. And shorts … I don’t have any shorts, I don’t have any puts, I don’t have anything, no calls, no stock, no nothing in Tesla.
I’ve never shorted a stock in my life. I know, it’s crazy: no matter how good your estimate is of how bad the company is, it doesn’t mean the stock can’t go the other way. If the acolytes and the believers buy it, it goes up and you get killed.
John Koetsier: As we’ve learned with the meme stocks. Absolutely. The AMCs.
Dan O’Dowd: That’s it. Right. And this is the meme stock. That’s what it is. And it’ll kill you.
John Koetsier: Wow. So there’s a lot to digest here. There’s a lot to think about. I want to thank you for taking the time to chat about it, and also for being open to being challenged and inviting people to come and check your videos and ride along with you.
I’m not sure the human race has ever been fully rational, but it seems like in the past few years we’ve really lost our ability to have debates over facts and debates over perceptions; rather, we throw insults or insane language at each other and accuse each other of crazy stuff.
Thank you for sharing that. And I really appreciate this time.
Dan O’Dowd: Alright, good to talk to you.