
Tesla Full Self-Driving software and autonomous vehicle safety

The automaker gave North American users access to its full self-driving system in 2022. However, critics say it fails to perform basic tasks and could lead to catastrophic results.

With hundreds of autonomous vehicles picking up passengers on the streets of San Francisco, Phoenix and other cities, the self-driving car industry has come a long way.

However, one full self-driving (FSD) software platform has been getting much criticism despite being in the hands of hundreds of thousands of consumers: Tesla's.

Tesla's FSD

Two years after introducing the FSD feature, the electric car manufacturer, whose cars are among the top-selling in the world, made the software available in 2022 to all Teslas in North America equipped with the autonomous capability.

Hundreds of thousands of people have used Tesla's FSD technology. Even so, in February the National Highway Traffic Safety Administration issued a recall of the FSD system, stating that it may "allow the vehicle to act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane." Tesla responded to the recall with an over-the-air software update.

Moreover, critics -- notably Dan O'Dowd, founder of The Dawn Project -- have called on Tesla to stop distributing the software, claiming it needs more work.

The Dawn Project advocates for replacing commercial software with software that "never fails and can't be hacked."

Despite owning five Teslas himself, O'Dowd has run ads in media outlets such as the New York Times and, most recently, during the Super Bowl spotlighting the alleged dangers of Tesla's FSD software.

O'Dowd's main point of contention is that Tesla's autonomous vehicles have been shown to drive past road closure signs and other safety signage. Multiple YouTube videos show the vehicles driving past school bus signs and mannequins.

"It's just not ready to be in 400,000 consumers' hands," O'Dowd said during an interview with TechTarget Editorial.

More users, better software

Tesla investor Ross Gerber, CEO of Gerber Kawasaki Wealth and Investment Management, disagrees with O'Dowd. He argues that autonomously driven Teslas will improve faster with so many consumers driving and testing the beta version of FSD.

The more people are given access to the software, the more they can provide feedback to the automaker, which can then make changes based on what works and doesn't work, he said.

"If we don't have it in lots of people's hands, we can't just gather data quick enough to make improvements rapid enough in the neural network so that it does get where it needs to be," Gerber said during a livestreamed event from Santa Barbara, Calif., on June 22, at which he and O'Dowd discussed the safety of Tesla's autonomous driving technology.


The media event was planned by Gerber and O'Dowd together, in an attempt by the Dawn Project founder to demonstrate the mistakes Tesla's FSD system can make. It drew more than 10,000 viewers on YouTube and included a discussion between the two men as well as a test drive of Gerber's Tesla.

Tesla says it is working on fixing some of these safety problems, and the software has advanced to the point where it disengages from FSD mode when drivers are not fully attentive or are doing specific tasks, such as looking at their phones.

The FSD capability can also detect whether a driver is not looking at the road or does not have their hands on the steering wheel.
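Tesla does not publish the internals of its driver-monitoring system, but the behavior described above -- disengaging the assist mode when attention checks fail -- can be sketched in a few lines. Everything here (field names, the grace-period threshold) is a hypothetical illustration, not Tesla's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class DriverState:
    eyes_on_road: bool          # e.g., from a cabin camera gaze estimate
    hands_on_wheel: bool        # e.g., from a steering-torque sensor
    seconds_inattentive: float  # time since the last attention check passed

# Assumed grace period before disengaging; real systems tune this carefully.
INATTENTION_LIMIT_S = 3.0

def should_disengage(state: DriverState) -> bool:
    """Disengage the driver-assist mode if the driver fails both
    attention checks for longer than the allowed grace period."""
    attentive = state.eyes_on_road or state.hands_on_wheel
    return (not attentive) and state.seconds_inattentive > INATTENTION_LIMIT_S

print(should_disengage(DriverState(False, False, 5.0)))  # True: disengage
print(should_disengage(DriverState(True, False, 5.0)))   # False: eyes on road
```

The design choice worth noting is that either signal (gaze or steering torque) is enough to count as attentive here; a production system would combine many more signals and escalate with warnings before disengaging.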

"It still needs a lot of work ... because of humans," Gerber said. "It's not necessarily because of the software. It's that the software has to work around humans."

All autonomous vehicles need to not only work around humans but also objects such as emergency vehicles, school buses, cones and other obstacles.

While Gerber admitted that Tesla needs to do more about some of those challenges, he noted that other autonomous vehicle companies, such as Waymo and Cruise -- which currently operate robotaxis in San Francisco that require no human interaction -- face the same problems.

"The challenges of dealing with these scenarios are still not solved from any of these autonomous players," Gerber said. "But with Tesla, the human has to sit in the car and still pay attention and basically be ready to drive that vehicle at any second."

"The way they're testing their software, in some ways, is safer than the way Cruise and Waymo are doing it where then if you have a problem, there's no human involved," he added.

To test their ideas, O'Dowd and Gerber set out for a test drive of Gerber's Model S, which has Tesla's latest software.

For most of the test, the car kept clear of pedestrians, stopped at stop signs and avoided the objects it needed to avoid. But about an hour into the drive, Gerber had to take over when the FSD software failed to recognize a stop sign and two cars passing each other at an intersection.

"I told you humans have to be engaged," Gerber said, referring to the incident.

Tesla investor Ross Gerber and critic Dan O'Dowd test out the automaker's latest FSD software, version 11, to address some of O'Dowd's criticisms.

The automaker's responsibility

While Tesla warns drivers to remain engaged at all times when driving its cars, it is still the manufacturer's responsibility to proactively put safeguards in place even if drivers don't listen, said Sam Abuelsamid, an analyst at Guidehouse.

All Teslas from the 2024 model year have "autopilot" driver-assist features. Some include limited autonomous driving, such as summoning cars in private parking lots.

Other automakers, such as GM, Ford, Nissan and BMW, have safeguards such as infrared driver-monitoring cameras in place to ensure drivers' eyes are on the road. These systems will automatically stop the car if they're not.

Some electric car makers use maps to geofence the system so drivers can only use the autonomous or full self-driving capability where it is safe.
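Geofencing of this kind amounts to checking whether the car's GPS position falls inside an approved region before enabling the feature. The sketch below uses a standard ray-casting point-in-polygon test; the polygon coordinates and function names are illustrative examples, not any automaker's actual map data or algorithm.

```python
# Illustrative geofencing sketch: an autonomy feature is permitted only
# when the vehicle's position lies inside an approved polygon of
# (lon, lat) vertices. Coordinates below are made up for the example.

def point_in_polygon(x: float, y: float,
                     polygon: list[tuple[float, float]]) -> bool:
    """Ray-casting test: cast a horizontal ray from (x, y) and count
    edge crossings; an odd count means the point is inside."""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # Edge straddles the ray's latitude, and the crossing is to the right.
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

def feature_allowed(lon: float, lat: float,
                    geofence: list[tuple[float, float]]) -> bool:
    """Enable the driving feature only inside the approved area."""
    return point_in_polygon(lon, lat, geofence)

# Hypothetical approved area: a simple square region.
APPROVED_AREA = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
print(feature_allowed(0.5, 0.5, APPROVED_AREA))  # True: inside the fence
print(feature_allowed(2.0, 0.5, APPROVED_AREA))  # False: outside
```

Real deployments use high-definition maps rather than hand-drawn polygons, but the gating logic is the same: no position fix inside an approved region, no autonomous mode.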

"Fully automated vehicles are something that is progressing slowly," Abuelsamid said. He added that companies such as Waymo, Cruise, Motional and Zoox test their software in environments where it is safe to do so, either in locations with little traffic or at night.

"Those companies understand what the limitations of their systems are," he added. "For experimental stuff that they're not sure can work safely, they test those on a closed test track. They don't put them in the customer's hands. That's irresponsible."

While neither O'Dowd nor Abuelsamid said Tesla should take its vehicles off the road, both want the automaker to disable its FSD system.

"Tesla is fully capable of doing that with an over-the-air software update," Abuelsamid said. "That would address the problem."

Esther Ajao is a news writer covering artificial intelligence software and systems.
