Tesla’s Full Self-Driving Software Is A Mess. Should It Be Legal?

Elon Musk keeps hyping the AI-enabled software, and getting more people to buy it is key to his massive new pay package. But in a recent test, it ignored standard street signs and even a flashing school bus stop sign – squashing mannequin child “Timmy.”


Elon Musk relentlessly promotes Tesla as a major player in autonomous driving, both in the robotaxi market and for personally owned vehicles running its Full Self-Driving system. His jaw-dropping $1 trillion pay package includes milestones of putting 1 million Tesla robotaxis on the road and reaching 10 million active FSD users over the next decade, so the technology’s success directly benefits him financially.

Whether that’s achievable remains to be seen, but an assessment by Forbes of the latest version of FSD found that it remains error-prone. During a 90-minute test drive in Los Angeles, on residential streets and freeways, the 2024 Model Y with Tesla’s latest hardware and software (Hardware 4, FSD version 13.2.9) ignored some standard traffic signs and posted speed limits; didn’t slow at a pedestrian crossing with a flashing sign and people present; and made pointless lane changes and accelerated at odd times, such as while exiting a crowded freeway with a red light at the end of the ramp. There’s also no indication the company has fixed a worrisome glitch identified two years ago: failing to stop for a flashing school bus stop sign indicating that children may be about to cross the street.

In fact, there are so many easy-to-find problems with the feature, recently redubbed “Full Self-Driving (Supervised),” that it raises a question: Why is the $8,000 option, also available as a $99-a-month subscription, even legal in its current form?

Turns out, there’s a simple answer: “Driving-assist systems are unregulated, so there are no concerns about legality,” said Missy Cummings, a George Mason University professor and AI expert who has advised the National Highway Traffic Safety Administration on autonomous vehicles. “NHTSA has the authority to step in, but up to now they’ve only stepped in for poor driver monitoring.”

Road Closed and Do Not Enter Signs

Nine years after the first Tesla owner in the U.S. was killed in an accident while using the company’s Autopilot driver-assist software – Joshua Brown in May 2016 – U.S. regulators have yet to set clear rules for so-called Level 2 driver-assist technology – the category FSD falls into – beyond requiring in-cabin monitoring to ensure human drivers pay attention to road conditions. This loophole creates an opportunity for Musk to promote FSD, first rolled out in 2020, as a feature that provides virtually autonomous driving with “minimal” human intervention, according to the in-car display. And because broader FSD adoption is a metric in his new compensation package, it stands to be massively lucrative for him.

NHTSA, which opened an investigation last month into Tesla’s failure to report FSD and Autopilot accidents in a timely manner, said it “does not pre-approve new technologies or vehicle systems.” Instead, it’s up to carmakers to certify that vehicles and technologies meet federal safety standards. If an investigation finds a system to be unsafe, “NHTSA will take any necessary actions to protect road safety,” a spokesperson said.

Musk’s comments about FSD’s capabilities aren’t nuanced: “Tesla self-driving massively improves your quality of life and safety for the thousands of life hours you’re in a car.” The company also promotes the system with videos showing seemingly flawless performance on road trips. Yet Tesla is more circumspect in what it tells NHTSA. “Tesla describes its systems as Level 2 partial automation (including ‘FSD Supervised’), requiring a fully attentive driver who is engaged in the driving task at all times,” the agency said.

Scrutiny of Tesla’s technology is increasing after legal setbacks, including the launch of a federal class-action lawsuit by Tesla owners over Musk’s exaggerated FSD and Autopilot claims and efforts by California’s DMV to bar the company from using those names for the features in that state. The jury in a federal trial in Florida also determined last month that Tesla was partially responsible for a fatal 2019 crash that occurred while its Autopilot feature was engaged, ordering it to pay $243 million in damages. The company is appealing the verdict; it settled two other lawsuits last week over crashes in California linked to Autopilot, which has more limited driving-assist features than FSD.

The Austin-based company is also operating a pilot robotaxi project in its home city for a limited number of Tesla owners, who are charged a nominal fee to use it. The version of FSD it uses is slightly modified from the version offered to customers, though the company hasn’t explained the differences in detail. Since launching the service in June, Tesla has run into a number of problems, including reports of three crashes in one day in July.

Unlike Waymo, which operates its robotaxis in full autonomous mode in multiple U.S. cities, Tesla keeps a safety driver in the front seat to monitor FSD’s performance.

Crashes involving the use of Autopilot and FSD have resulted in 59 fatalities, according to a running tally compiled from news reports by TeslaDeaths.com.

Musk and Tesla didn’t respond to a request for comment on FSD’s safety issues, but an updated version of the software is expected soon.

Flashing Pedestrian Sign


Forbes recently conducted its 90-minute assessment of the most recent FSD software, updated on Aug. 28, in a 2023 Model Y with Tesla’s latest hardware. The vehicle was owned by the Dawn Project, an organization created by software developer Dan O’Dowd, who in recent years has become a vocal critic of FSD, even spending his own money on Super Bowl commercials to raise awareness of its flaws.

In its current form, “this is not even a beta system. This is an alpha-level product. It should never be in the customer’s hands,” O’Dowd told Forbes. “It’s just a prototype. It’s not a product.”

FSD, at times, can feel like an autonomous driving system. Punch in an address and it takes off in hands-free driving mode with no problems. It signals when making a turn, stops at stop signs and traffic lights, and is generally observant of its surroundings, based on the virtual images displayed on the center console. Yet the frequency of errors, even in light traffic and optimal weather, particularly on urban streets, means a human behind the wheel must stay highly attentive and ready to take control. Because of that, for any mindful driver, using it is no simpler or more relaxing than driving yourself.

Two years ago, O’Dowd’s Dawn Project conducted a simple test of FSD involving a parked school bus. As the Tesla approached, a lighted stop sign on the side of the bus flipped out, alerting drivers to stop so children could safely cross. In the original test, the Tesla failed to stop in every run, not even slowing when a child-sized mannequin crossed its path at the front of the bus. This month Forbes replicated the test and the Tesla failed again: it didn’t stop for the warning sign and once again ran down “Timmy,” the Dawn Project’s mannequin.

Shortly afterward, we repeated the test with a Waymo summoned directly from the Waymo app. The car stopped for the sign, didn’t move until the sign was retracted and did not run over Timmy.

School Bus Stop Sign And Timmy

“We reported the Tesla school bus thing over two years ago and they didn’t do anything about it,” O’Dowd said. “We put it in the New York Times in a full-page ad. Then we did a Super Bowl commercial showing it. … But [Tesla] didn’t do anything about it.”

And it’s not just flashing school bus signs. FSD also doesn’t appear to stop at train tracks when the crossing gate comes down and lights are flashing, according to some owners. Reports of disturbing glitches are not uncommon. One owner said his Tesla Model 3 with FSD version 13.2.9 suddenly stopped midway through executing an unprotected left turn, with an oncoming vehicle headed for him.

“I have this car bearing down on me at about 45mph, and I’m stopped, hanging out halfway in its lane,” according to a Tesla Motors Club post. “I fairly quickly pressed the accelerator to override FSD when I got my next surprise. The car quickly moved… in reverse!” Fortunately, reversing carried the Tesla out of the oncoming car’s path, and there was no one behind the owner, who was rattled by the experience.

In its evaluation, the auto review and research firm Edmunds found improvements in the FSD software but also “annoying” problems. “It won’t make slight adjustments within the lane to avoid objects in the road, such as roadkill, blown-out tire debris or potholes,” Edmunds said. “That is not great, but it’s made worse because FSD doesn’t like you taking control to make those corrections either. Turn the steering wheel just a bit too much to avoid an object in the road and the entire system abruptly turns off.”

Edmunds doesn’t recommend that customers pay $8,000 for FSD in its current form.

FSD’s issues avoiding road debris were more than annoying for two Tesla influencers trying to fulfill Musk’s promise, first made in 2016, of having a vehicle drive autonomously across the U.S. About 60 miles after setting out from San Diego, their new Model Y failed to avoid a large metal object in the middle of their highway lane, causing severe damage to the vehicle’s underbody, according to a video posted by one of the influencers, Bearded Tesla Guy.

Neither Consumer Reports nor the Insurance Institute for Highway Safety, whose assessments are heavily influential on the auto industry and regulators, has published detailed evaluations of FSD. However, IIHS rates Tesla’s driver-alert features for FSD as poor.

NHTSA has the authority to issue a “stop sale” notice to automakers in the event it finds significant safety problems for specific models or features. In 2016, it sent a “special order” letter to startup Comma.ai over an after-market product the company was selling to partially automate driving for certain auto models. That led Comma to halt plans to sell the device at the time.

NHTSA should do something similar with FSD, according to O’Dowd, though it’s unclear whether it will. For now, the Dawn Project is providing driving evaluations like the one conducted for Forbes to public officials, such as California Attorney General Rob Bonta and U.S. Representative Salud Carbajal, to demonstrate the feature’s shortcomings. O’Dowd’s company, Green Hills Software, a long-time supplier of software to the Defense Department and aircraft manufacturers, funds the Dawn Project. He declined to detail the project’s current budget, but said it’s “substantial.”

“A drug company wouldn’t call something a universal, full cancer cure when it didn’t actually cure cancer. No one would do that. You would be sued into the ground. You’d be thrown in jail and they’d take everything away from you,” O’Dowd said. “But [Musk] does it every day because no one in the government will take action. No regulators will take action at this point. That’s kind of what we’re here for.”

Mark Rosekind, former chief safety officer for robotaxi developer Zoox and the NHTSA administrator in 2016 when the first fatal Tesla Autopilot crash occurred, thinks a combination of new regulations for technology like FSD and validation by expert outside entities is needed.

“If you really want safety, transparency and trust in autonomous vehicle opportunities, you’re going to need to strengthen and enhance the federal structures with really innovative approaches,” he told Forbes. That should include “complementary third-party independent, neutral programs … certain requirements that they go through to demonstrate the safety. Transparency that will build trust.”

For now, autonomous vehicle researcher Cummings sees one factor that may blunt the impact of the shortcomings of Tesla’s tech. “The one really good thing about how bad FSD is, is that most people understand it is terrible and watch it very closely,” she said.

Source: https://www.forbes.com/sites/alanohnsman/2025/09/23/teslas-full-self-driving-software-is-a-mess-should-it-be-legal/