Tesla FSD Found To Be Erratic And Dangerous
Tesla’s much-anticipated RoboTaxi event is just weeks away, but new findings from independent research firm AMCI Testing could cast a shadow over the announcement.
AMCI has completed what it calls the “most extensive real-world test” of Tesla’s Full Self-Driving (FSD) software to date, running the system through over 1,000 miles of mixed driving conditions. The results are concerning: on average, FSD required human intervention at least once every 13 miles to maintain safe operation.
While Tesla markets FSD as a nearly autonomous system, AMCI found that the software’s performance was erratic and, at times, dangerous. One of the most worrying takeaways from the report was that when FSD made errors, they tended to be sudden and potentially catastrophic. This raises questions about whether Tesla’s autonomous tech is really ready for prime time.
This could be a blow to Tesla’s upcoming RoboTaxi launch, which hinges on the reliability of its driverless systems. Despite these setbacks, Tesla continues to sell FSD as a five-figure option, leaving some drivers wondering if they’re just unpaid beta testers.
Michael is an experienced automotive storyteller and accomplished photographer known for engaging and insightful content. He also brings a wealth of technical knowledge: he was part of the Ford GT program at Multimatic, oversaw a fleet of Audi TCR race cars, ziptied Lamborghini Super Trofeo cars back together, has been over the wall during the Rolex 24, and has worked in the cut-throat world of IndyCar.
Comments
Anyone who thinks Tesla FSD is safe is not in touch with reality. ANY vehicle that wants to be certified as FSD needs to go through third-party testing, and fully pass. Anything less than that puts everyone on the road at risk.
Not just Tesla, but all those that are following them.