Oh, Look—Another Tesla FSD Failure NHTSA Needs To Investigate

Tesla’s Full Self-Driving system is in the headlines again after two U.S. senators started making noise about its alleged struggles to detect and respond to railroad crossings.
Key Points
- Senators warn Tesla’s Full Self-Driving may fail to detect and respond to railroad crossings, potentially leading to deadly multi-vehicle crashes.
- NHTSA is already probing millions of Teslas with FSD after reported crashes in poor visibility, adding to concerns about the system’s safety.
- Lawmakers urge regulators to restrict FSD to specific road and weather conditions until its safety can be proven.
The senators sent a public letter earlier this week—before the shutdown, naturally—to the National Highway Traffic Safety Administration (NHTSA). Democratic Senators Ed Markey and Richard Blumenthal urged the safety regulator to launch an investigation into Tesla’s FSD system. The senators highlighted several reported near-miss incidents at train crossings.
At least six Tesla owners experienced near-collisions or crashes while FSD was engaged at or near train tracks, including one report that a Tesla using FSD drove through an active railroad crossing arm, sending the car skidding off the road near an oncoming train.
“Although mistakes such as a missed traffic sign or an illegal lane change are dangerous, a miscalculation at a train crossing can lead to catastrophic, multi-fatality collisions involving vehicle occupants, train passengers, and rail workers,” the senators wrote.
Tesla promotes Full Self-Driving as an advanced driver-assistance system capable of handling everything from lane changes to parking, though it still requires human supervision. The system has always struggled when conditions are less than ideal, and most drivers are bad at staying attentive once they've been let off the hook.
The senators are now pushing NHTSA to consider limiting the use of FSD if the safety risks prove credible. “The agency should consider clear and obvious actions to protect the public, including restricting Tesla’s FSD to the road and weather conditions it was designed to operate in,” the letter reads. “Moreover, NHTSA’s previous investigations into FSD show that the system’s failures are not isolated. Tesla’s system has been shown to misinterpret basic traffic infrastructure, particularly in low visibility or complex roadway conditions.”
Late last year, the system was formally put under investigation when NHTSA began looking at 2.4 million Teslas after four collisions in low-visibility conditions such as fog, glare, and dust. One of those incidents was fatal.
A modified version of FSD underpins Tesla’s Robotaxi service, which, unsurprisingly, has also caught the attention of NHTSA for poor driving standards, including several crashes in the service’s first month of operation.
Unlike competitors that rely on lidar, radar, and camera fusion, Tesla's FSD uses cameras alone. The decision leaves the system vulnerable to blind spots and environmental factors. Regulators also don’t consider FSD to be a true self-driving system, as it still requires a human driver to remain alert and intervene when needed.
While we're at it, let's highlight the pair of Model Y owners who recently attempted to drive from Los Angeles to Jacksonville, Florida, using FSD the entire way. Their trip came to a halt just 60 miles in, when the crossover hit debris in the road after the FSD camera suite failed to identify the hazard and either change course or stop. The impact broke the Model Y's suspension and caused considerable underbody damage.
The driver probably should have intervened, but the incident highlights the problem with assuming drivers will take over from a passively relaxed state: their nervous system isn't primed to react, because they've sat back and let the car do the driving for them.

An experienced automotive storyteller and accomplished photographer known for engaging and insightful content, Michael also brings a wealth of technical knowledge—he was part of the Ford GT program at Multimatic, oversaw a fleet of Audi TCR race cars, ziptied Lamborghini Super Trofeo cars back together, has been over the wall during the Rolex 24, and worked in the intense world of IndyCar.