By Akash Sriram and Abhirup Roy
(Reuters) – A self-driving Tesla carrying an Uber passenger rammed into an SUV at an intersection in a Las Vegas suburb in April, an accident that sparked new concerns that a growing stable of self-styled “robotaxis” is exploiting a regulatory gray area in U.S. cities, putting lives at risk.
Tesla CEO Elon Musk is set to unveil a robotaxi, a self-driving car intended for ride-hailing services, on October 10, and has long contemplated a Tesla-run taxi network of autonomous vehicles owned by individuals.
However, do-it-yourself versions are already emerging, according to 11 ride-hail drivers who use Tesla’s Full Self-Driving (FSD) software. Many say the software, which costs $99 a month, has limitations, but that they use it because it reduces the stress of driving, which allows them to work longer hours and earn more money.
Reuters is the first to report on the Las Vegas crash and a related investigation by federal safety officials, as well as the widespread use of Tesla’s driver-assistance software among ride-hail drivers.
While test versions of self-driving taxis with human backup drivers from robotaxi operators such as Alphabet’s Waymo and General Motors’ Cruise are heavily regulated, state and federal authorities say Tesla drivers alone are responsible for their vehicles, whether or not they use driver-assistance software. Waymo and Cruise use test versions of software categorized as fully autonomous, while Tesla’s FSD is categorized at a level that requires driver supervision.
According to the police report, the other driver in the April 10 Las Vegas crash, who was taken to the hospital, was faulted for failing to yield the right of way. Las Vegas Tesla driver Justin Yoon said on YouTube that the Tesla software failed to slow his vehicle even after the SUV emerged from a blind spot created by another vehicle.
Yoon, who posts YouTube videos under the banner “Project Robotaxi,” was in the driver’s seat of his Tesla, with his hands off the steering wheel, as it entered the intersection in a Las Vegas suburb, footage from inside the car showed. The Tesla on FSD navigated the vehicle at 74 km per hour (46 mph) and did not initially register the SUV crossing the road in front of Yoon. At the last moment, Yoon took control and swerved the car into a glancing blow, the footage shows.
“It’s not perfect, it will make mistakes, it will probably continue to make mistakes,” Yoon said in a video after the crash. Yoon and his passenger suffered minor injuries and the car was totaled, he said.
Yoon discussed the use of FSD with Reuters before publicly posting videos of the accident, but did not respond to requests for comment afterward.
Tesla did not respond to requests for comment. Reuters could not reach the Uber passenger and other driver for comment.
Ride-hailing companies Uber and Lyft responded to questions about FSD by saying drivers are responsible for safety.
Uber, which said it was in contact with the driver and passenger in the Las Vegas crash, cited its community guidelines: “Drivers are expected to maintain an environment where passengers feel safe, even if driving behavior does not violate the law.”
Uber also cited instructions from Tesla warning drivers using FSD to keep their hands on the wheel and be ready to take over at any time.
Lyft said: “Drivers agree that they will not engage in reckless behavior.”
BIG AMBITIONS
Musk has big plans for self-driving software based on the FSD product. The technology will serve as the foundation of the robotaxi software, and Musk envisions a Tesla-run autonomous ride service using customer-owned vehicles when they are not otherwise in use.
But the drivers who spoke to Reuters also described critical shortcomings of the technology, including sudden unexplained acceleration and braking. Some have stopped using it in complex situations such as airport pick-ups, navigating parking lots and construction zones.
“I use it, but I’m not completely comfortable with it,” said Sergio Avedian, a ride-hail driver in Los Angeles and a senior contributor to the YouTube channel “The Rideshare Guy,” an online community of ride-hail drivers with nearly 200,000 subscribers. Avedian avoids using FSD while carrying passengers. Based on his conversations with fellow drivers on the channel, however, he estimates that 30% to 40% of Tesla ride-hail drivers in the U.S. regularly use FSD.
FSD is categorized by the federal government as a form of partial automation that requires the driver to be fully engaged and attentive while the system steers, accelerates and brakes. It has drawn increased regulatory and legal scrutiny, with at least two fatal accidents involving the technology. But using it for ride-hail work is not against the law.
“Ride-share services enable the use of these partial automation systems in commercial environments, and that is something that should be thoroughly explored,” said Guidehouse Insights analyst Jake Foose.
The U.S. National Highway Traffic Safety Administration said it was aware of Yoon’s crash and had reached out to Tesla for additional information, but did not respond to specific questions about additional regulations or guidance.
Authorities in California, Nevada and Arizona, which oversee the operations of ride-hail and robotaxi companies, said they do not regulate the practice because FSD and similar systems fall outside the scope of robotaxi or autonomous-vehicle regulation. They did not comment on the crash.
Uber recently enabled its software to send passenger-destination details to Tesla’s dashboard navigation system, a move that helps FSD users, according to Omar Qazi, an X user with 515,000 followers who posts under the handle @WholeMarsBlog and often gets public replies from Musk on the platform.
“This will make it even easier to take Uber rides on FSD,” Qazi said in an X post.
Tesla, Uber and Lyft have no way of telling whether a driver is both working for a ride-hailing company and using FSD, industry experts say.
While nearly all major automakers offer some version of partial-automation technology, most are limited in their capabilities and restricted to highway use. Tesla, by contrast, says FSD enables the vehicle to drive itself almost anywhere with active driver supervision but minimal intervention.
“I’m glad Tesla is doing it and getting it done,” said David Kidd, senior research scientist at the Insurance Institute for Highway Safety. “But from a safety perspective, it raised a lot of hairs.”
In lieu of new regulations, Kidd said NHTSA should consider providing basic, non-binding guidance to prevent misuse of such technologies.
Any federal oversight would require a formal investigation into how ride-hail drivers use all driver-assistance technologies, not just FSD, said Missy Cummings, director of the George Mason University Autonomy and Robotics Center and a former adviser to NHTSA.
“If Uber and Lyft were smart, they would lead the way and ban that,” she said.
Meanwhile, ride-hail drivers want more from Tesla. Kaz Barnes, who has made more than 2,000 trips with passengers using FSD since 2022, told Reuters he looked forward to the day when he could step out of the car and let Musk’s network send it out to work.
“You’d just take the training wheels off,” he said. “I hope to be able to do that with this car one day.”