Humans must take active role in self-driving safety, report says

Advocates of self-driving vehicles say the technology can have an enormous effect on traffic safety, since the vast majority of crashes are caused by human error. By removing human input, autonomous vehicles have the potential to drastically reduce, if not eliminate, the deaths and injuries caused by distraction, fatigue, and driver impairment. Autonomous vehicles might also improve mobility for people who otherwise wouldn't be able to drive.

However, driverless vehicles have also raised questions about traffic policy and how the autonomous systems will react in certain situations. A new report by the Governors Highway Safety Association stresses that humans will still have an important impact on traffic safety as increasingly automated vehicles are introduced.

The GHSA report—"Preparing for Automated Vehicles: Traffic Safety Issues for States"—was funded by the insurance company State Farm and written by Jim Hedlund, a former senior official at the National Highway Traffic Safety Administration. Researchers concluded that humans will continue to have significant control over vehicles in the near future, and will still influence traffic safety as vehicles with a higher level of autonomy start to coexist with older vehicles with little or no autonomy.

"Imperfect human drivers aren't disappearing anytime soon and even with self-driving technology, they will still be in a position to cause crashes, deaths, and injuries on our roads," said Jonathan Adkins, executive director of the GHSA. "As autonomous vehicle technology advances, states still must invest in programs to prioritize safe travel behavior."

The Society of Automotive Engineers and NHTSA define six levels of vehicle automation, from Level 0 (no automation) to Level 5 (full automation). Level 2 vehicles, whose features can maintain a vehicle's speed and lane position, are currently available. Drivers can take their hands off the steering wheel and allow the vehicle to drive itself, but automakers discourage this behavior since human input may be needed at any time.

One key concern with partial automation is that drivers may rely too heavily on the vehicle features, increasing the possibility of a crash. Semi-autonomous vehicles also require drivers to monitor the road and take over when prompted by the vehicle, but drivers who are distracted by other tasks while the vehicle is driving itself may not respond promptly.

In Level 3 automation, the vehicle can take full control in certain situations and will inform the driver when they need to take over. Some automakers believe this creates too great a risk that drivers will be distracted at the moment they need to take over, and thus don't plan to make Level 3 vehicles available to the public. Instead, they'll look to advance directly to Level 4 (in which the vehicle can be in control for the entire trip) and Level 5 (in which the vehicle can drive itself without any human occupants present).

Several states have authorized testing of autonomous vehicles, or commissioned studies on whether to allow it, but the GHSA concluded that fewer than half of the states have encouraged this activity. In Connecticut, for example, the testing of self-driving vehicles is allowed only if a human driver is present. The report recommends that states encourage the testing of autonomous vehicles but also ensure that the tests are subject to oversight, regulations, and coordination with law enforcement.

The report notes that many people are skeptical about driverless vehicles. Drivers often don't trust that a vehicle system can competently navigate a route, and recent incidents—including a pedestrian killed by a self-driving vehicle and fatal crashes involving drivers who relied too heavily on advanced autonomous systems—have also resulted in increased wariness toward the technology.

"Many people are unconvinced of the safety benefits of AVs and unwilling to share the road or to ride in them," said Ryan Gammelgard, counsel at State Farm. "However, research suggests that public enthusiasm and support will grow as people learn more about AVs and are able to experience them firsthand, and if there is objective proof that the technology operates better than humans."

The report says a number of potential situations must also be addressed to improve the safety and functionality of self-driving vehicles. These include:

- the possibility of unlicensed drivers activating an autonomous vehicle but being unable to take over when prompted;
- how autonomous vehicles will respond to police commands;
- the potential for criminals to stop self-driving vehicles and rob their occupants;
- how driverless vehicles will react in situations where they need to take evasive action, including potentially having to decide whether to strike a pedestrian or risk harm to the vehicle's occupants by swerving into an obstacle;
- determining fault in traffic incidents, including whether an autonomous system was active and whether the driver was monitoring the road; and
- how autonomous vehicles might interpret pedestrian behaviors.

"States need to consider a number of new issues related to the practical deployment of this technology," said Hedlund. "One of the most important goals should be to educate the public about the benefits and risks of this technology, how to use it safely, and drive near AVs in traffic."

The GHSA recommended that states consider laws to be implemented as autonomous vehicles become more prevalent, including requiring a licensed driver to be present in the vehicle to take control if need be. The report also suggested that law enforcement agencies should develop procedures for handling driverless vehicles in traffic stops and crashes.

In addition, the GHSA said programs to educate autonomous vehicle owners and operators could make these drivers more aware of the capabilities and limitations of their vehicles' systems, as well as the responsibilities human drivers still have when operating this type of vehicle. The report says this information could be incorporated into states' driver education courses and licensing exams.

