Tesla Full Self-Driving
A Tesla driver blames FSD for nearly causing a train crash, raising concerns about the system's safety and limitations. He highlights the risk of driver complacency despite Tesla's warnings. (Image: Twitter / @DoluBatarya)

Craig Doty II, a Tesla owner, narrowly avoided a collision after his vehicle, in Full Self-Driving (FSD) mode, allegedly steered towards an oncoming train.

Nighttime dashcam footage from earlier this month in Ohio captured the harrowing scene: Doty's Tesla barreling toward a train crossing with no apparent deceleration. He insisted the car was in Full Self-Driving mode at the time.

Dashcam footage shows the driver taking desperate action, swerving through the railway crossing sign and slamming on the brakes just meters from the oncoming train.

Tesla faces mounting legal challenges over allegedly misleading claims about its Autopilot and Full Self-Driving capabilities. Owners claim these features malfunctioned by failing to stop for other vehicles or by swerving into objects, causing crashes, some of them tragically fatal.

FSD Malfunction? Driver Reports Close Call with Train

Taking to the Tesla Motors Club forum, Doty reported a troubling issue. Despite owning the car for less than a year, he claimed it had twice steered itself directly toward oncoming trains in FSD mode within the past six months.

Doty reportedly attempted to file a complaint and to find similar cases, but claims no lawyer would take his case because his injuries, backaches and a bruise, were not serious enough.

Tesla acknowledges that low-light conditions, adverse weather such as rain or snow, direct sunlight, and fog can significantly impair performance. The company strongly advises drivers to exercise caution and avoid using FSD in these scenarios.

Per the company, these conditions can hinder the functionality of Tesla's sensor suite, including the ultrasonic sensors fitted to older vehicles, which rely on high-frequency sound waves to detect surrounding objects.

"The currently enabled Autopilot and Full Self-Driving features require active driver supervision and do not make the vehicle autonomous," the Elon Musk-led company notes in an article shared on its official website's Support page.

Additionally, FSD relies primarily on a suite of cameras to perceive its surroundings; Tesla has phased radar and ultrasonic sensors out of newer vehicles in favor of its camera-only "Tesla Vision" approach. The cameras provide a 360-degree view and are crucial for recognizing traffic lights and stop signs.

Is Tesla's Autopilot Safe?

Under low-visibility conditions like fog or heavy rain, a Tesla vehicle's camera systems can be significantly hampered in their ability to detect the environment accurately.

Doty admitted to continuing to use FSD despite the prior incident. He said he had developed a sense of trust in the system's ability to perform correctly, as he had not encountered any other problems. "After using the FSD system for a while, you tend to trust it to perform correctly, much like you would with adaptive cruise control," he said.

Doty noted that an FSD user can fall into the habit of assuming the car will slow down for slower traffic, and warned that when the system fails, the driver is forced to take drastic action to avoid a crash. This false sense of security builds up precisely because the system usually works, which makes incidents like these all the more alarming.

Tesla's manual clearly outlines driver expectations for FSD use. It emphasizes the need for constant vigilance, including keeping hands on the wheel, monitoring road conditions and traffic, staying alert for pedestrians and cyclists, and being prepared to take immediate control.