Founder of The Dawn Project, Dan O’Dowd, was recently interviewed by The Primary Loop to discuss the risks of hackable autonomous systems and how to ensure that they are safe.

Dan’s interview was published on 5 November 2023, and can be found here.

Dan discussed the vulnerabilities of code used in safety-critical systems, and how to keep it safe from attack, particularly in the military, in aviation and in industry.

When asked about the security of consumer devices, Dan commented: “They have a very minimal level of security on them. They don’t even meet the requirements that you see in big IT installations, which are being hacked all the time.”

Dan further discussed the security risk in self-driving vehicles, noting: “Consumer devices, like cars, are a terrible risk, because they’re deadly, that if they fail, they can kill people. And systematic failures in those sorts of devices can cause large catastrophes, and so can hacking. Somebody could take control of a self-driving car, well, they can hack a self-driving car. But we have over-the-air updates, so everything gets updated to new software very rapidly. So, if there’s a bug in one car, there’s a bug in all those cars running that same software, which could be millions. And then they’re all on the internet. So, they can all be accessed by somebody, by a hacker of some sort. So now they can get into all the cars and do something as simple as turning to UK driving mode from the US, telling the cars to drive the wrong way on the wrong side of the road. Imagine a million cars suddenly driving on the wrong side of the road.”

Dan also commented: “We need to have a far higher protection on the software than that, and security – which does exist in the military. They have much stronger security measures than are used in any commercial systems. And in the aircraft there are reliability processes, where people go through the software, it gets reviewed many, many, many times. At Tesla, they change the braking software, and they ship it to customers the next day.”

In discussing Tesla’s lack of safety testing, Dan argued: “You can’t do that, you have to test it for weeks and months. You want to test it in the Saudi Arabian Desert, you want to test it in Nome, Alaska, you want to test it in a monsoon in Pakistan, before you deliver it to hundreds of thousands of customers. Not the next day where you can say “oh, we fixed it, we made this improvement in one day.” To me that’s horrifying, not comforting.”

Dan further highlighted Tesla’s failure to carefully design its safety-critical self-driving software, commenting: “You design your system, you discover, you analyse what parts are really critical, and you do the parts that are really critical, very carefully. That’s what we should do. But they’re not doing it.”