Autonomous vehicles need to avoid human-like errors

Driver mistakes play a role in virtually all crashes. That’s why automation has been held up as a potential game changer for safety. But autonomous vehicles might prevent only around a third of all crashes if automated systems drive too much like people, according to a new study from the Insurance Institute for Highway Safety.

Conventional thinking has it that self-driving vehicles could one day make crashes a thing of the past. According to a national survey of police-reported crashes, driver error is the final failure in the chain of events leading to more than 9 out of 10 crashes. But the Institute’s analysis suggests that only about a third of those crashes were the result of mistakes that automated vehicles would be expected to avoid simply because they have more accurate perception than human drivers and aren’t vulnerable to incapacitation. To avoid the other two-thirds, they would need to be specifically programmed to prioritize safety over speed and convenience.

“Building self-driving cars that drive as well as people do is a big challenge in itself,” says IIHS Research Scientist Alexandra Mueller, lead author of the study. “But they’d actually need to be better than that to deliver on the promises we’ve all heard.”

Consider the crash of an Uber test vehicle that killed a pedestrian in Tempe, Arizona, in March 2018. Its automated driving system initially struggled to correctly identify 49-year-old Elaine Herzberg on the side of the road. But once it did, it was still unable to predict that she would cross in front of the vehicle, and it failed to execute the correct evasive maneuver when she did.

For the study, the researchers imagined a future in which all the vehicles on the road are self-driving. They assumed these future vehicles would prevent those crashes that were caused exclusively by perception errors or involved an incapacitated driver. That's because the cameras and sensors of fully autonomous vehicles could be expected to monitor the roadway and identify potential hazards better than a human driver, and because such vehicles would be immune to distraction and incapacitation.

The fact that deliberate decisions made by drivers can lead to crashes indicates that rider preferences might sometimes conflict with the safety priorities of autonomous vehicles. For self-driving vehicles to live up to their promise of eliminating most crashes, they will have to be designed to focus on safety rather than rider preference when those two are at odds.

Self-driving vehicles will need not only to obey traffic laws but also to adapt to road conditions and use driving strategies that account for uncertainty about what other road users will do, such as driving more slowly than a human driver would in areas with heavy pedestrian traffic or in low-visibility conditions. "It will be crucial for designers to prioritize safety over rider preferences if autonomous vehicles are to live up to their promise to be safer than human drivers," Mueller says.

The information in this article is courtesy of the IIHS and the July 22 issue of Status Report.
