Tesla Motors is being investigated by the National Highway Traffic Safety Administration after a fatal crash in which the electric vehicle’s automated driving software failed to react to another vehicle on the road.
The accident occurred on May 7 in Williston, Fla., according to federal regulators, who have opened a formal investigation into the crash. The investigators are looking at whether the electric vehicle’s Autopilot highway driving system was at fault.
Without naming the victim, Tesla called the accident “a tragic loss” in a statement. The Silicon Valley company said that it had informed regulators about the accident “immediately” after it occurred, though it only acknowledged the incident once the investigation became public on Thursday.
The Florida Highway Patrol later identified the driver as Joshua Brown, the owner of a technology consulting firm in Ohio, and The New York Times confirmed the identification.
Many companies, including Google and Tesla, have been aggressive in testing cars that combine computers, sensors, and radar to drive themselves on highways and city streets. Ford, BMW, and other traditional automakers are also pouring vast sums into self-driving technology. General Motors made a splash with its purchase of software maker Cruise Automation for $1 billion earlier this year.
As automated driving systems have evolved, questions have been raised about whether the industry’s eagerness has outpaced the technology. Most companies counter that, by taking driving out of human hands, autonomous cars will make driving safer and reduce the accident epidemic in the United States.
According to data from the Insurance Institute for Highway Safety, there were more than 32,000 motor vehicle crash deaths in the United States in 2014.
Tesla’s Autopilot software is meant only to handle highway driving, where there are fewer distractions for the sensors and cameras that give the vehicle a detailed view of its surroundings. The technology enables cars to make automatic lane changes, detect other cars on the road, and apply the brakes before a collision occurs.
According to Tesla, the accident was the result of “extremely rare circumstances”: as a tractor trailer made a left turn in front of Mr. Brown’s vehicle, his Model S crashed into the trailer’s midsection without applying the brakes.
“Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied,” the company explained in a statement.
Adding to Autopilot’s confusion was the height of the trailer, which caused the sensors to ignore what was actually in the vehicle’s path. Had the vehicle been heading for the front or back wheels, the system would have engaged the brakes.
The accident is a setback for Tesla, which has fiercely defended the safety of its Autopilot feature. Last month, its chief executive, Elon Musk, went so far as to say that Autopilot was probably better than humans at driving.
In the news release, Tesla backpedaled slightly from that statement, saying that Autopilot was still a test feature in the beta phase. It noted that before the system turns on, drivers must agree to keep their hands on the wheel and be prepared to take control.
“Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert,” the company said.
The accident is likely to be the latest fuel for engineers and scientists who are at best ambivalent about the technology and at worst fiercely opposed to it. In a recent survey from the University of Michigan, around 46 percent of drivers said that they would not want any autonomous driving mode in their vehicle.
Mary Cummings, a Duke University robotics professor, testified in a Senate hearing earlier this year that autonomous cars were “absolutely not ready for widespread deployment,” citing a glaring lack of research data on weather conditions, including rain and snow, that impair automotive sensors.
Cummings said that attempts to get autonomous cars on the road were “indicative of a larger problem in robotics, especially in self-driving cars and drones, where demonstrations are substituted for rigorous testing.”