We were promised a very near future where self-driving cars would serve our needs and vehicle ownership would be rendered unnecessary.
Robots would quickly and efficiently deliver our orders. People could squeeze in a few more hours of work or sleep while being chauffeured in self-driving cars. Older adults who had given up driving would have another option to get around.
Progress has been made on at least some of this. University campuses and cities across North America have indeed witnessed the growing presence of small food-delivery robots. Likewise, new partnerships have recently been announced to develop and test the safety of self-driving trucks.
The journey toward autonomous or self-driving consumer cars, on the other hand, has arguably come to a screeching halt. In 2021, top industry experts acknowledged that developing safe autonomous driving systems was not as simple as anticipated. Elon Musk himself conceded that building the technology required to deliver safe self-driving cars had proved harder than he thought.
More bad news came in mid-June, when the U.S. National Highway Traffic Safety Administration (NHTSA) released numbers showing that Tesla vehicles accounted for nearly 70% of reported crashes involving SAE Level 2 driver-assistance systems.
How They Work
Some cars are fully autonomous, capable of driving without any input from a human driver. For example, Waymo One, in Phoenix, Ariz., is a ride-hailing service that uses autonomous cars on a test route.
SAE Level 2 cars, such as those equipped with Tesla Autopilot, require human drivers to stay alert at all times, even when the system temporarily takes control of steering and acceleration. When the system determines that traffic or road conditions aren’t adequate for it to operate, it hands control back to the driver, who must immediately resume manual control of the vehicle.
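The handoff described above can be pictured as a simple decision rule. The sketch below is a deliberately minimal, hypothetical illustration of that logic — the function name, inputs, and mode labels are assumptions for clarity, not any manufacturer's actual implementation:

```python
# Hypothetical sketch of SAE Level 2 handoff logic. All names and
# states here are illustrative assumptions, not real vendor code.

def control_mode(system_confident: bool, driver_attentive: bool) -> str:
    """Decide who handles steering and acceleration at this instant."""
    if not driver_attentive:
        # Level 2 requires a supervising human at all times; real
        # systems warn the driver and may disengage if attention lapses.
        return "alert_driver"
    if system_confident:
        # Traffic and road conditions are adequate: the system may
        # temporarily manage steering and acceleration.
        return "system_assist"
    # Conditions are inadequate: control goes back to the human driver.
    return "driver_manual"

print(control_mode(system_confident=True, driver_attentive=True))
print(control_mode(system_confident=False, driver_attentive=True))
```

The key point the sketch makes is that the human is always a fallback: there is no state in which the system operates without a driver ready to take over.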
Human factors engineering is a cross-disciplinary research field that investigates how humans interact with technology, including vehicles. For years, its researchers have highlighted the safety risks of automated driving — risks that worsen when the system requires the driver to compensate for its technological shortcomings.
That is the automation paradox: The more automated the vehicle, the harder it is for humans to operate it properly.
One of the most prominent risks of operating SAE Level 2 cars arises when drivers misunderstand the capabilities of the automated system. That misunderstanding often leads to unsafe behaviors, such as reading a book or taking a nap while the vehicle is in motion.
New Study on Crashes
In 2021, there were so many reports of unsafe behaviors at the wheel of Level 2 cars that the NHTSA required manufacturers to start reporting crashes that had occurred when these systems were engaged.
The initial findings, released in June 2022, showed that since 2021, Tesla and Honda vehicles had been involved in 273 and 90 reported crashes, respectively, while these systems were engaged. Most of the crashes occurred in Texas and California.
These data paint a dismal picture of the safety of these systems. But they pale in comparison with the more than 40,000 fatal crashes reported in the United States in 2021 alone.
In the same report, NHTSA itself highlights some of the study's methodological limitations: from the incompleteness of some of the source data to the failure to account for each manufacturer's total vehicle volume or the distance its vehicles traveled.
Even for the skeptics, this does not spell the end of autonomous cars. It does, however, suggest that widespread use of safe self-driving cars is not years, but decades, away.
This article was provided by The Conversation via Reuters.