Here’s what needs to happen to achieve safe self-driving cars

Published 8:16 am Monday, September 11, 2023

Elon Musk’s vision for Tesla (TSLA) is not just to revolutionize the push toward electric vehicles. He wants to revolutionize the entire industry, notably by selling a car that lacks a steering wheel but has the ability to drive itself.

Tesla has released a few versions of advanced driver-assist software, Autopilot and Full Self-Driving (FSD). But FSD is a bit of a misnomer; cars using it do not truly drive themselves. The driver has to be ready to take over if necessary, as the software is still far from safe and reliable, something Musk himself inadvertently proved during a recent demo of the tech.

Despite constant promises that true FSD is right around the corner, the tech is just not yet at a safe point for mass distribution, UChicago computer science professor and AI expert Bo Li said in an interview with TheStreet. 

The road to self-driving cars

There are three main things that need to happen to propel self-driving cars forward. 

“First of all, rigorous standardized testing is very important,” Li said. “Every company has their own simulators and testing platforms, but there’s no standardized testing yet.”

A standardized, third-party baseline would provide a more objective measure of an individual model’s safety and vulnerabilities.
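
To make that concrete, here is a minimal sketch of what a shared benchmark could look like: every company’s planner implements one common interface and is scored against the same fixed scenarios. Everything here (Planner, Scenario, run_benchmark, the simulate callable) is a hypothetical name invented for illustration, not part of any real standard.

```python
# Hypothetical sketch of a standardized test harness: one shared
# interface, one shared scenario set, one shared scoring rule.
from dataclasses import dataclass
from typing import Protocol


@dataclass
class Scenario:
    name: str
    initial_state: dict      # road layout, actors, weather, etc.
    safety_margin_m: float   # minimum gap (meters) the car must keep


class Planner(Protocol):
    def act(self, observation: dict) -> dict:
        """Return a driving command for the current observation."""
        ...


# The same fixed scenarios are used for every vendor's model.
SCENARIOS = [
    Scenario("highway_cruise", {"road": "highway", "actors": 3}, 2.0),
    Scenario("pedestrian_steps_out", {"road": "urban", "actors": 12}, 1.0),
]


def run_benchmark(planner: Planner, simulate) -> dict:
    """Score any planner on the shared scenario set.

    `simulate` stands in for a third-party simulator: it replays a
    scenario with the planner in the loop and returns the smallest
    gap, in meters, the car kept to any other actor.
    """
    return {
        s.name: simulate(s, planner) >= s.safety_margin_m
        for s in SCENARIOS
    }
```

The design choice that matters is that the scenarios and scoring live outside any one company, so two models can be compared on exactly the same footing.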

The second step involves a different approach to ensuring the trustworthiness of a car powered and controlled by an artificial intelligence model: increasing the number of models involved.

A single model in control of a car, Li said, is “very dangerous.” But a self-driving car powered by multiple, separate components is much more sophisticated, and therefore much safer.

“We need to have various components to work together with a car, for instance, adding the traffic rules as a logical knowledge database, or adding the regulations so that the cars have not a single model,” Li said. 
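
One rough way to picture that architecture, as a sketch only: a learned planner proposes an action, and a separate, hand-written rule base, standing in for the “logical knowledge database” of traffic rules Li describes, can veto it. Every name in the snippet is hypothetical.

```python
# Toy illustration of the multi-component idea: a learned planner
# proposes an action; an explicit rule base can override it.

def neural_planner(observation: dict) -> dict:
    # Stand-in for a learned model's output.
    return {"throttle": 0.4, "brake": 0.0, "steer": 0.0}


TRAFFIC_RULES = [
    # Each rule maps (observation, proposed action) to True (allowed)
    # or False (violation). Rules are explicit code, not learned.
    lambda obs, act: not (obs["light"] == "red" and act["throttle"] > 0),
    lambda obs, act: obs["speed_mph"] <= obs["speed_limit_mph"] or act["brake"] > 0,
]


def drive(observation: dict) -> dict:
    """Combine the learned planner with the rule base.

    If any rule rejects the proposed action, fall back to a safe
    default (brake) instead of trusting the single model.
    """
    proposed = neural_planner(observation)
    if all(rule(observation, proposed) for rule in TRAFFIC_RULES):
        return proposed
    return {"throttle": 0.0, "brake": 1.0, "steer": 0.0}


# Example: at a red light, the planner's throttle command is vetoed.
obs = {"light": "red", "speed_mph": 30, "speed_limit_mph": 35}
assert drive(obs)["brake"] == 1.0
```

No single learned model gets the final say; an explicit, auditable layer sits between the model and the controls.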

The last step, specifically for cars trained on real-world data, is that researchers need to be confident that self-driving cars behave as expected not only in normal cases but also in rare events. The problem is that there is far more data on normal events, such as driving down a highway, than on the kinds of rare events that can result in accidents.

“When you train a model on real-world collected data, you’re not seeing those patterns enough, and this leaves a lot of holes,” Li said. “The model could be vulnerable in those special events.” 
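
One common mitigation, sketched below under the assumption of a dataset where each example is tagged with an event type (all names are illustrative), is to oversample the rare events so the model sees them more often during training.

```python
# Sketch of rare-event oversampling for a long-tailed dataset.
import random
from collections import Counter


def oversample_rare(dataset: list[tuple[dict, str]], seed: int = 0) -> list:
    """Resample so every event type appears as often as the most common one.

    `dataset` is a list of (example, event_type) pairs, e.g. highway
    cruising examples vastly outnumbering animal-on-road examples.
    """
    rng = random.Random(seed)
    counts = Counter(event for _, event in dataset)
    target = max(counts.values())
    balanced = []
    for event in counts:
        pool = [ex for ex in dataset if ex[1] == event]
        # Draw with replacement up to the majority class size.
        balanced.extend(rng.choices(pool, k=target))
    return balanced


data = [({"frame": i}, "highway_cruise") for i in range(1000)]
data += [({"frame": i}, "animal_on_road") for i in range(3)]
balanced = oversample_rare(data)
# Both event types now contribute 1000 examples each.
```

The caveat matches Li’s point: resampling only stretches the rare examples a team already has. It cannot manufacture patterns the fleet never recorded, so the holes she describes can shrink but not vanish.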

With analysts and investors bullish on the possibility of a thriving robotaxi industry by 2030, Li thinks it’s difficult to know how long it will take to actually get there. But she said it will likely be longer than the optimistic end-of-decade predictions; researchers and engineers both need “slightly more time to understand” the new AI models that are coming out.

She’s not even sure that fully self-driving cars will ever be truly possible (or safe) in every environment. The tech, Li said, can be leveraged well in simple scenarios, such as highway driving.

“But for generic autonomous driving in urban places, I think it’s very hard because the scenarios are too complicated,” she said. “For models, we cannot control the outcome. I think at this stage, we should control the use-case of scenario and domains to benefit from them.”

“I feel it’s very dangerous to let them just go widely in any scenario.”  
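
Li’s point about controlling use-cases maps onto what the industry calls an operational design domain: the system engages only in conditions it has been validated for. A toy sketch of that gate, with purely hypothetical names and conditions:

```python
# Operational-domain gate: autonomy is offered only in scenarios
# the stack was validated for (here, highway driving in fair weather).

ALLOWED_DOMAINS = {"highway"}


def autonomy_available(observation: dict) -> bool:
    """Allow autonomy only inside the validated operational domain."""
    return (
        observation["road_type"] in ALLOWED_DOMAINS
        and observation["weather"] in {"clear", "overcast"}
    )


# Urban driving falls outside the gate, so the human keeps control.
assert not autonomy_available({"road_type": "urban", "weather": "clear"})
assert autonomy_available({"road_type": "highway", "weather": "clear"})
```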

Tesla is currently facing a number of investigations into the safety of its FSD software.

If you work for Tesla, contact Ian by email at ian.krietzberg@thearenagroup.net or on Signal at 732-804-1223.
