What We Know About the Fatal Tesla Accident

Numerous media stories have reported the first fatality in a self-driving car. The most important thing to know is that the Tesla involved in the crash was not a self-driving car. In the National Highway Traffic Safety Administration’s classification of automated cars, a self-driving (“level 4”) car “performs all safety-critical functions for the entire trip,” and even in a “level 3” car “the driver can fully cede control of all safety-critical functions in certain conditions.” The Tesla was neither.

Instead, the Tesla was equipped with an Advanced Driver Assistance System (ADAS) that performs some steering and speed functions but still requires continuous driver monitoring. In the NHTSA’s classification, it was a “level 2” car, meaning it automated “at least two primary control functions”: in this case, adaptive cruise control (controlling speed to avoid hitting vehicles ahead) and lane centering (steering within the stripes). BMW, Mercedes, and other manufacturers also offer cars with these functions; the difference is that those cars do not allow drivers to take their hands off the wheel for more than a few seconds, while the Tesla does. This may have given some Tesla drivers the impression that their car was a level 3 vehicle that could fully take over “all safety-critical functions in certain conditions.”

The next most important thing to know is that the Florida Highway Patrol’s initial accident report blamed the crash on the truck driver’s failure to yield the right-of-way to the Tesla. When making a left turn from the eastbound lanes of the highway, the truck should have yielded to the westbound Tesla. Still, it is possible, if not likely, that the accident would not have happened if the Tesla’s driver had been paying full attention to the road.

Mobileye, the company that made the camera-based collision-avoidance system used in the Tesla, says that its system is designed only to prevent a car from rear-ending slower-moving vehicles, not to keep it from hitting vehicles laterally crossing its path. Even if the sensors had detected the truck, automatic braking systems typically can bring a car to a full stop only if it is traveling no more than 30 miles per hour faster than the obstacle. Since the road in question is marked for 65 miles per hour, and a laterally crossing truck closes at nearly the car’s full travel speed, the system could not have stopped the Tesla.
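The speed-differential argument can be expressed as a small calculation. This is only an illustrative sketch, not Tesla’s or Mobileye’s actual logic; the function name and structure are hypothetical, and the 30 and 65 mph figures are the ones cited above.

```python
# Illustrative sketch of the automatic-emergency-braking speed argument.
# Constants are the figures cited in the text; the function is hypothetical.

AEB_FULL_STOP_DELTA_MPH = 30   # max closing speed at which AEB can fully stop
ROAD_SPEED_LIMIT_MPH = 65      # posted limit on the highway in question

def can_fully_stop(own_speed_mph, obstacle_speed_mph):
    """True if the closing speed is within the braking system's full-stop envelope."""
    closing_speed = own_speed_mph - obstacle_speed_mph
    return closing_speed <= AEB_FULL_STOP_DELTA_MPH

# A slower car ahead traveling 40 mph: closing speed is 25 mph, within the envelope.
print(can_fully_stop(ROAD_SPEED_LIMIT_MPH, 40))  # True

# A truck crossing laterally contributes roughly 0 mph of same-direction speed,
# so the closing speed is the car's full 65 mph, well beyond the envelope.
print(can_fully_stop(ROAD_SPEED_LIMIT_MPH, 0))   # False
```

The point of the sketch is that the same braking system can handle the rear-end scenario it was designed for while being physically unable to handle a crossing vehicle at highway speed.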

Thus, the Tesla driver who was killed in the accident, Joshua Brown, probably should have been paying more attention. There are conflicting reports about whether Brown was speeding or was watching a movie at the time of the accident. Neither was mentioned in the preliminary accident report, but even if true, they would not change the fact that the Tesla had the right of way over the truck.

Just two months before the accident, Duke University roboticist Missy Cummings presciently testified before Congress that auto companies were “rushing to market” before self-driving cars were ready and that “someone is going to die.” She did not mention Tesla by name, but since Tesla is so far the only car company that allows American drivers to take their hands off the wheel for more than a few seconds, she may have had it in mind.

Tesla’s Autopilot system relies on two forward-facing sensors: a non-stereo camera and radar. Tests by a Tesla owner have shown that a system using these sensors will not always stop the vehicle from hitting obstacles in the road. By comparison, the Mercedes and BMW systems use a stereo camera (which can detect approaching obstacles more quickly) and five radar sensors (which can detect different kinds of obstacles over a wider range). In allowing drivers to take their hands off the steering wheel, Tesla may thus have oversold its cars’ capabilities.

The day before information about the Tesla accident became publicly known, the National Association of City Transportation Officials issued a policy statement about self-driving cars urging, among other things, that drivers not be allowed to use “partially automated vehicles” except on limited-access freeways because “such vehicles have been shown to encourage unsafe driving behavior.” While such a rule would have prevented the Tesla crash, it ignores the possibility that partial automation has net safety benefits overall.

A few days after the accident became publicly known, NHTSA announced that traffic fatalities had increased by 7.7 percent in 2015, the largest increase in many years. As Tesla CEO Elon Musk somewhat defensively pointed out, partial automation can probably cut fatalities in half, and full automation is likely to cut them in half again. State and federal regulators should not allow one accident in an ADAS-equipped car to color their judgments about true self-driving cars that are still under development.