
Official Journal of Northwestern Center for Public Safety

The Key

The Human Hurdle: Human Error as a Roadblock to Early ADS Adoption

by Victor Beecher
Automated Driving Systems (ADS) are expected to result in the greatest sea change in transportation since the invention of the automobile. Reaching a point of mass adaptation, though, is akin to a long-distance, high-hurdle steeplechase. One of the highest hurdles for ADS to clear is the human driver itself.

ADS innovations that have reached SAE’s Level 2, Partial Automation (see below), include adaptive cruise control, automated lane departure warnings, and automated parking, all of which are now standard or optional features in a wide range of vehicles manufactured since 2017. (Lucke) These advancements still rely on a driver to monitor the driving environment and “perform all remaining aspects of the dynamic driving task.” (SAE) For instance, while auto-parking a 2017 Ford Fusion, the driver must brake and be aware of traffic, even though the ADS is steering. According to SAE Chief Product Officer Frank Menchaca, “Right now, cars are accelerating, turning, and braking on their own, but they still require human attention and intervention. The literature suggests that when people aren’t fully responsible, their reaction times tend to get longer.” (Horaczek)

In SAE Levels 3 and above, the ADS monitors the driving environment within a specific automated driving mode, and the driver may or may not be requested to intervene. (SAE) Among these are the many vehicles now being tested by companies such as Ford, Toyota, Apple, Waymo, and GM. In September 2017, in response to a fatal collision involving a Tesla, the NTSB issued a release summarizing the human driver–related problems, including the following points: (NTSB)

  • “The Tesla driver’s pattern of use of the Autopilot system indicated an over-reliance on the automation and a lack of understanding of the system limitations.
  • “If automated vehicle control systems do not automatically restrict their own operation to conditions for which they were designed . . . the risk of driver misuse remains.
  • “[How] the Tesla Autopilot system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement.
  • “[E]vidence revealed the Tesla driver was not attentive to the driving task.”

In the NTSB release, Chairman Robert L. Sumwalt III stated, “Smart people around the world are hard at work to automate driving, but systems . . . like Tesla’s Autopilot are designed to assist drivers . . . These systems require the driver to pay attention all the time and to be able to take over immediately when something goes wrong.”

Most on-road testing is conducted in Arizona, California, Michigan, Nevada, and Pennsylvania, and these public, real-world tests show that human drivers of traditional vehicles are now the cause of most collisions involving an ADS vehicle. Since 2014, 104 incidents in California have involved ADS vehicles; 49 of those have occurred in 2018, to date.1 (Stewart) In 57% of these incidents, the ADS vehicle was rear-ended; in another 29%, it was sideswiped. Most of these collisions are rooted in the frustration or distraction of the traditional driver.
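To put those percentages in rough absolute terms, the shares cited above can be applied to the 104 reported incidents. This is a minimal sketch; the total and percentages come from the article (Stewart), and treating the percentages as applying to all 104 incidents is an assumption, since the source does not state the exact base.

```python
# Approximate breakdown of California ADS-involved incident reports,
# using the figures cited above. Assumes the percentages apply to
# the full total of 104 incidents reported since 2014.
total_incidents = 104        # ADS-involved incidents, 2014 to date
rear_ended_share = 0.57      # ADS vehicle struck from behind
sideswiped_share = 0.29      # ADS vehicle sideswiped

rear_ended = round(total_incidents * rear_ended_share)   # about 59
sideswiped = round(total_incidents * sideswiped_share)   # about 30
other = total_incidents - rear_ended - sideswiped        # about 15

print(f"Rear-ended: {rear_ended}, Sideswiped: {sideswiped}, Other: {other}")
```

In other words, roughly six of every seven reported incidents involve the ADS vehicle being struck by another vehicle rather than striking one.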

Kyle Vogt, cofounder and CEO of Cruise, notes that human drivers expect other drivers to "bend or break traffic rules, rolling through four-way intersections, accelerating to make a yellow light, or cruising over the speed limit." But automated cars won't break the rules. In the sideswipe cases, drivers of traditional vehicles have grown frustrated, for instance, while following an ADS car traveling 30 mph where most traffic pushes 40 mph; in attempting to squeeze around the ADS vehicle, they have sideswiped it. As for the rear-end collisions, drivers following ADS vehicles don't realize that they need to be far more attentive, because ADS cars make full stops at stop signs, stop at crosswalks, and otherwise follow the letter of the traffic law. (Stewart)

Chris Urmson, the head of Google’s self-driving car program (now called Waymo), has personal experience. In July 2015, he was in a Google self-driving car that was hit from behind by a traditionally driven vehicle. “Our self-driving cars are being hit surprisingly often by other drivers who are distracted and not paying attention to the road,” he wrote in his blog. “Other drivers have hit us 14 times since the start of our project in 2009 (including 11 rear-enders), and not once has the self-driving car been the cause of the collision. Instead, the clear theme is human error and inattention.” (Urmson)

One suggestion for reducing collisions between traditional and ADS vehicles is to label the ADS vehicles, much as some parents put "New Driver" signs in their rear windows when teaching their teenagers to drive. (Stewart)

Sumwalt stated, “While automation in highway transportation has the potential to save tens of thousands of lives, until that potential is fully realized, people still need to safely drive their vehicles.” (NTSB) The term fully realized is significant. ADS developers acknowledge that true automated driving will not be possible until all motorized vehicles are automated. §

SAE's Levels of Vehicle Automation. Standard J3016. Copyright © SAE International.

 1 California is the only state that requires ADS-related companies to submit an annual report to the DMV.

Resources: