Despite ruling Tesla will not have to recall its self-driving Autopilot software in the wake of a fatal crash in 2016, the US road safety regulator has urged the company to better inform drivers of the feature's limitations at the point of sale.
The US National Highway Traffic Safety Administration (NHTSA) ruled on 19 January that Autopilot was not to blame for the death of Tesla owner Joshua Brown, who died when his Model S crashed into a truck turning across the road in front of him. The truck's white colour and bright sunlight behind meant Autopilot's cameras failed to spot the obstacle in time, and despite having seven seconds to react, neither did Brown.
The agency said there is a degree of "confusion" over what Autopilot is and is not capable of. In simple terms, the system takes radar-guided cruise control available on other vehicles and adds autonomous steering, keeping the car in its lane and safely behind the vehicle in front without any physical involvement from the driver. Autopilot can also attempt to avoid accidents to some extent, but still requires the driver to be fully alert and ready to take back control.
NHTSA spokesperson Bryan Thomas said: "It's not enough to put it in an owners' manual and hope that drivers will read that and follow that. There should be clear training at the point of sale." Regarding the language used by Tesla and other manufacturers when referring to the self-driving system, Thomas added: "We encourage the industry to confront this as well."
Autopilot is engaged with two pulls of a lever to the side of the steering wheel. It then controls the accelerator, brakes and steering to keep the car in lane and safely behind the car ahead. If that car brakes, Autopilot will brake too, even to a standstill, before setting off again. It will also switch lanes when the driver uses the indicator, and accelerate up to a driver-imposed speed limit where it is safe to do so. While technically possible, taking both hands entirely off the wheel is discouraged.
Regarding potential confusion among Tesla drivers over what Autopilot can do, Thomas said the marketing and naming of the system "has an impact on drivers' understanding of its capabilities". He added that there has been "confusion ... and the use of the system outside of its intended domain."
Tesla said: "The safety of our customers comes first, and we appreciate the thoroughness of NHTSA's report and its conclusion."
Videos uploaded to YouTube have shown Tesla drivers engaging Autopilot on roads without clear lane markings, which are therefore unsuitable for the system. One appeared to show a Tesla Model S driving itself on a public road with no one in the driver's seat, while another featured a Tesla driving itself in stop-start traffic while the driver appeared to be asleep against the window.