Every three months, Tesla releases a safety report that provides the number of miles between crashes when drivers use the company’s driver assistance system, Autopilot, and the number of miles between crashes when they don’t.
These figures always show that accidents are less frequent with Autopilot, a set of technologies that can drive, brake and accelerate Tesla vehicles on their own.
But the numbers are misleading. Autopilot is primarily used for highway driving, which is generally twice as safe as driving on city streets, according to the Department of Transportation. Fewer accidents may occur with Autopilot simply because it is typically used in safer situations.
Tesla has not provided data that would allow a comparison of Autopilot's safety on the same types of roads. Neither have other automakers that offer similar systems.
Autopilot has been on public roads since 2015. General Motors introduced Super Cruise in 2017 and Ford Motor introduced BlueCruise last year. But publicly available data that reliably measures the safety of these technologies is scarce. American drivers, whether using these systems or sharing the road with them, are effectively guinea pigs in an experiment whose results have yet to be revealed.
Automakers and technology companies are adding more features to vehicles that they claim improve safety, but these claims are hard to verify. Meanwhile, deaths on the country’s highways and streets have been rising in recent years, reaching a 16-year high in 2021. It would appear that any additional safety provided by technological advances is not making up for drivers’ poor decisions behind the wheel.
“There is a lack of data that would give the public confidence that these systems, as implemented, deliver the expected safety benefits,” said J. Christian Gerdes, a professor of mechanical engineering and co-director of the Center for Automotive Research at Stanford University, who was the Department of Transportation’s first chief innovation officer.
GM collaborated with the University of Michigan on a study that explored the possible safety benefits of Super Cruise, but concluded that it did not have enough data to understand whether the system reduced accidents.
A year ago, the National Highway Traffic Safety Administration, the government’s auto safety regulator, ordered companies to report potentially serious accidents involving advanced driver-assistance systems like Autopilot within a day of learning of them. The order said the agency would make the reports public, but it has not yet done so.
The safety agency declined to comment on what information it had collected so far, but said in a statement that the data would be released “in the near future.”
Tesla and its CEO, Elon Musk, did not respond to requests for comment. GM said it had reported two Super Cruise-related incidents to the NHTSA: one in 2018 and one in 2020. Ford declined to comment.
The agency’s data is unlikely to provide a complete picture of the situation, but it could encourage lawmakers and drivers to take a closer look at these technologies and ultimately change the way they are marketed and regulated.
“To solve a problem, you first have to understand it,” said Bryant Walker Smith, an associate professor in the University of South Carolina schools of law and engineering who specializes in emerging transportation technologies. “This is a way to get more truth on the ground as a basis for investigations, regulations and other actions.”
Despite its capabilities, Autopilot does not remove responsibility from the driver. Tesla tells drivers to stay alert and be ready to take control of the car at all times. The same goes for BlueCruise and Super Cruise.
But many experts worry that these systems, because they allow drivers to relinquish active control of the car, could lead them to believe their cars are driving themselves. Then, when technology fails or can’t handle a situation on its own, drivers may not be prepared to take control as quickly as needed.
Older technologies like automatic emergency braking and lane departure warning have long provided safety nets for drivers by slowing or stopping the car or warning drivers when they drift out of their lane. But newer driver assistance systems change that arrangement by making the driver the safety net for the technology.
Safety experts are particularly concerned about Autopilot because of the way it’s marketed. For years, Musk has said that the company’s cars were on the verge of true autonomy: driving themselves in virtually any situation. The name of the system also implies an automation that the technology has not yet achieved.
This can lead to driver complacency. Autopilot has played a role in many fatal accidents, in some cases because drivers were not prepared to take control of the car.
Musk has long promoted Autopilot as a way to improve safety, and Tesla’s quarterly safety reports seem to back him up. But a recent study by the Virginia Transportation Research Council, an arm of the Virginia Department of Transportation, shows that these reports are not what they seem.
“We know that cars using Autopilot crash less often than when Autopilot isn’t used,” said Noah Goodall, a council researcher who explores safety and operational issues related to autonomous vehicles. “But are they being driven in the same way, on the same roads, at the same time of day, by the same drivers?”
Analyzing police and insurance data, the Insurance Institute for Highway Safety, a nonprofit research organization funded by the insurance industry, found that older technologies such as automatic emergency braking and lane departure warning have improved safety. But the organization says that studies have not yet shown that driver assistance systems provide similar benefits.
Part of the problem is that police and insurance data do not always indicate whether these systems were in use at the time of the accident.
The federal auto safety agency has ordered companies to provide data on crashes in which driver-assistance technologies were in use within 30 seconds of impact. This could provide a broader picture of how these systems are performing.
But even with that data, safety experts said, it will be difficult to determine whether using these systems is safer than turning them off in the same situations.
The Alliance for Automotive Innovation, a trade group for auto companies, warned that data from the federal safety agency could be misinterpreted or misrepresented. Some independent experts express similar concerns.
“My big concern is that we will have detailed data on crashes involving these technologies, without comparable data on crashes involving conventional cars,” said Matthew Wansley, a professor at Cardozo Law School in New York who specializes in emerging automotive technologies and was previously general counsel to an autonomous vehicle startup called nuTonomy. “It could potentially appear that these systems are much less safe than they really are.”
For this and other reasons, automakers may be reluctant to share some data with the agency. Under its order, companies can ask the agency to withhold certain data on the grounds that it would reveal trade secrets.
The agency is also collecting crash data on automated driving systems, more advanced technologies that aim to remove drivers from cars altogether. These systems are often called “autonomous cars.”
For the most part, this technology is still being tested in a relatively small number of cars with drivers behind the wheel for backup. Waymo, a company owned by Google parent Alphabet, operates a driverless service in suburban Phoenix, with similar services planned in cities including San Francisco and Miami.
Companies are already required to report crashes involving automated driving systems in some states. Data from the federal safety agency, which will cover the entire country, should also provide additional information in this area.
But the more immediate concern is the safety of Autopilot and other driver assistance systems, which are installed in hundreds of thousands of vehicles.
“There is an open question: Is Autopilot increasing the frequency of accidents or decreasing it?” Mr. Wansley said. “We may not get a complete answer, but we will get useful information.”