In 10 months, nearly 400 car accidents in the United States involved advanced driver assistance technologies, the federal government’s top car safety regulator revealed on Wednesday, in its first large-scale release of data on these systems.
In the 392 incidents cataloged by the National Highway Traffic Safety Administration from July 1 last year to May 15, six people died and five were seriously injured. Teslas operating with Autopilot, the company’s most ambitious driver assistance mode, or any of its associated component features were involved in 273 crashes. Five of those Tesla crashes were fatal.
The release is part of a sweeping effort by the federal agency to assess the safety of advanced driving systems as they become increasingly commonplace. Beyond the futuristic appeal of self-driving cars, dozens of automakers have rolled out automated components in recent years, including features that let drivers take their hands off the wheel under certain conditions and that help them parallel park.
“These technologies promise to improve safety, but we need to understand how these vehicles work in real-world situations,” said Steven Cliff, the agency’s administrator. “This will help our researchers quickly identify potential trends in emerging defects.”
Speaking to reporters ahead of Wednesday’s release, Dr. Cliff also cautioned against drawing conclusions from the data collected so far, noting that it did not take into account factors such as the number of cars from each manufacturer that are on the road and equipped with these types of technologies.
“The data may raise more questions than they answer,” he said.
Some 830,000 Tesla cars in the United States are equipped with Autopilot or the company’s other driver assistance technologies, which helps explain why Tesla vehicles accounted for nearly 70 percent of the reported accidents.
Ford Motor, General Motors, BMW and others have similar advanced systems that allow hands-free driving in certain road conditions, but far fewer of those models have been sold. These companies have, however, sold millions of cars over the past two decades that are equipped with individual components of driver assistance systems. The components include so-called lane keeping, which helps drivers stay in their lanes, and adaptive cruise control, which maintains a car’s speed and brakes automatically when traffic ahead slows.
In a statement on Wednesday, NHTSA said Honda vehicles were involved in 90 incidents and Subarus in 10. Ford, GM, BMW, Volkswagen, Toyota, Hyundai and Porsche each reported five or fewer.
Dr. Cliff said NHTSA would continue to collect data on crashes involving these types of features and technologies, noting that the agency would use it to guide any rules or requirements for how they should be designed and used.
The data includes cars with advanced systems designed to operate with little or no intervention from the driver, and separate data on systems that can simultaneously steer and control a car’s speed but require constant driver attention.
In 11 crashes, a car with one of these technologies activated was proceeding straight and collided with another vehicle that was changing lanes, according to the data.
Fully automated vehicles, which for the most part are still in development but are being tested on public roads, were involved in 130 incidents, NHTSA found. One resulted in a serious injury, 15 in minor or moderate injuries, and 108 in no injuries. Many of the crashes involving automated vehicles were fender benders or bumper taps, because the vehicles operate mainly at low speeds and in city driving.
In more than a third of the accidents related to advanced systems, the car was stopped and hit by another vehicle.
Most of the incidents involving automated vehicles occurred in San Francisco or the Bay Area, where companies such as Waymo, Argo AI and Cruise are testing and refining the technology.
Waymo, which is owned by Google’s parent company and operates a fleet of driverless taxis in Arizona, was involved in 62 incidents. Cruise, a division of GM, was involved in 23. Cruise has just started offering driverless taxi rides in San Francisco, and this month it received permission from California authorities to begin charging passengers for driverless rides.
None of the cars using the most advanced systems were involved in fatal crashes, and only one crash resulted in a serious injury. In March, a cyclist struck a Cruise-operated vehicle from behind while both were traveling down a San Francisco street.
The data was collected under an order NHTSA issued a year ago requiring carmakers to report crashes involving cars equipped with Level 2 advanced driver assistance systems, known as ADAS, or automated driving systems.
The order was motivated in part by accidents and fatalities over the past six years involving Teslas operating on Autopilot. Last week, NHTSA expanded its investigation into whether Autopilot has technological and design flaws that pose safety risks. The agency has been investigating 35 crashes that occurred while Autopilot was activated, including nine since 2014 that killed 14 people. It has also opened a preliminary investigation into 16 incidents in which Teslas under Autopilot control crashed into emergency vehicles that had stopped with their lights flashing.
Problems with Tesla Autopilot System
Claims of safer driving. Tesla cars can use computers to handle some aspects of driving, such as changing lanes. But there are concerns that this driver assistance system, called Autopilot, is not safe. Here is a closer look at the issue.
Driver assistance and crashes. A 2019 crash that killed a college student shows how gaps in Autopilot and driver distraction can have tragic consequences. In another crash, a Tesla struck a truck, killing a 15-year-old California boy. His family sued the company, alleging that the Autopilot system was partly to blame.
Safety shortcuts? Former Tesla employees said the carmaker may have undermined safety by designing its Autopilot driver assistance system to fit the vision of Elon Musk, its chief executive. They said Mr. Musk insisted that the system rely solely on cameras to track a car’s surroundings, rather than also using additional sensing devices, as other companies’ autonomous vehicle systems typically do.
A data gap. A lack of reliable data also makes it difficult to assess the system’s safety. Reports Tesla publishes every three months suggest that crashes are less common with Autopilot than without, but the figures can be misleading: they do not account for the fact that Autopilot is used mainly for highway driving, which is generally twice as safe as driving on city streets.
The NHTSA’s order was an unusually bold move for the regulator, which has been criticized in recent years for not being more assertive with carmakers.
“The agency is gathering information to determine whether, in the field, these systems pose an unreasonable safety risk,” said J. Christian Gerdes, a professor of mechanical engineering and a director of the Center for Automotive Research at Stanford University.
An advanced driver assistance system can steer, brake and accelerate vehicles on its own, although drivers must be alert and prepared to take control of the vehicle at any time.
Safety experts are concerned that these systems allow drivers to give up active control of the car and could lull them into thinking that their cars are driving themselves. When the technology malfunctions or cannot handle a particular situation, drivers may not be prepared to take control quickly.
Some independent studies have examined these technologies, but none have yet shown whether they reduce crashes or otherwise improve safety.
In November, Tesla recalled nearly 12,000 vehicles that were part of the beta test of Full Self Driving, a version of Autopilot designed for use on city streets, after deploying a software update that the company said could cause crashes by unexpectedly activating the cars’ emergency braking system.
The NHTSA order required companies to report crashes in which advanced driver assistance systems or automated technologies were in use within 30 seconds of impact. While the data gives a broader picture of how these systems behave than ever before, it is still difficult to determine whether they reduce crashes or otherwise improve safety.
The agency has not collected data that would let researchers readily determine whether using these systems is safer than driving without them in the same situations. Carmakers were also allowed to redact descriptions of what happened during the crashes, an option that Tesla, as well as Ford and others, used regularly, making the data harder to interpret.
“The question is: What is the baseline against which we’re comparing this data?” said Dr. Gerdes, the Stanford professor, who from 2016 to 2017 was the first director of innovation at the Department of Transportation, of which NHTSA is a part.
But some experts say comparing these systems to human driving should not be the goal.
“When a Boeing 737 falls out of the sky, we don’t ask, ‘Is it falling out of the sky more or less than other planes?’” said Bryant Walker Smith, an associate professor of law at the University of South Carolina who specializes in emerging transportation technologies.
“Crashes on our roads are the equivalent of several plane crashes every week,” he added. “That comparison is not necessarily what we want. If there are crashes that these driving systems contribute to, crashes that would not otherwise have happened, that is a potentially fixable problem that we need to know about.”
Jason Kao, Asmaa Elkeurti and Vivian Li contributed.