Self-driving car industry confronts trust issues after Uber crash

March 23, 2018 - 10:40 AM
U.S. National Transportation Safety Board (NTSB) investigators examine a self-driving Uber vehicle involved in a fatal accident in Tempe, Arizona, U.S., March 20, 2018. A woman was struck and killed by the vehicle on March 18, 2018. National Transportation Safety Board/Handout via Reuters

The fatal accident involving an Uber self-driving car cranks up pressure on the self-driving vehicle industry to prove its software and sensors are safe in the absence of strong government standards, experts in the field said.

Automakers including General Motors Co (GM.N), technology companies such as Alphabet Inc and ride services providers like Uber Technologies Inc have all urged policy makers at the federal and state level not to put a heavy regulatory hand on an industry still in development. They have said their extensive testing demonstrates commitment to safety.

Uber is currently looking for a head of global safety operations who would “drive key strategic programs including Safety Experience and Building Trust,” according to a job posting on the company’s website. The search was posted before the Arizona fatality.

The Uber accident in Tempe, Arizona, this week caused the first death attributed to a self-driving car operating in autonomous mode. It has given ammunition to critics of the industry who are concerned that the lack of clear standards allows manufacturers to test faulty or partially developed technology on public streets.

Well before Sunday’s fatal accident, industry executives had begun to confront questions about whether self-driving cars can be trusted. They have opened up about their testing methods without revealing the secrets of their system designs.

Public disclosure of self-driving car testing data is inconsistent and varies by state. California requires manufacturers to report instances when an autonomous vehicle system disengages. Arizona does not.

“There is no question whatsoever that regulations are coming,” said Doug Mehl, a partner at A.T. Kearney’s automotive practice, based in Detroit. “But right now (automakers), software developers and service providers have an opportunity to shape what those regulations are going to look like.”

Alphabet’s Waymo self-driving car unit has underscored in a report that its autonomous vehicles have now logged 5 million miles in real-world testing, and billions more in computer simulations. GM’s Cruise Automation unit has highlighted its decision to teach its driving system to navigate San Francisco’s congested streets.

Still, Amnon Shashua, head of Intel Corp’s (INTC.O) Mobileye vision systems unit, said the industry must do more. He has called for the self-driving vehicle industry to develop “provable safety assurances.”

“We need to prove that these vehicles are much, much safer than humans,” Shashua told Reuters. “How do you go and guarantee that you have a technology that the probability of a fatality per one hour of driving is 1,000 times better than a human? Nobody talks about that because nobody knows what to do.”
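
As a rough illustration of the arithmetic behind that target (the human baseline used here is an assumed, commonly cited order-of-magnitude estimate, not a figure from Shashua’s remarks):

    # Back-of-the-envelope sketch of the "1,000 times better" target.
    # The human baseline rate is an assumption for illustration only.
    human_fatality_rate = 1e-6   # assumed fatalities per hour of human driving
    improvement_factor = 1_000   # Shashua's "1,000 times better" goal
    av_target_rate = human_fatality_rate / improvement_factor
    print(av_target_rate)        # 1e-09: one fatality per billion hours of driving

Proving that a system actually meets such a rate is the hard part Shashua alludes to: it cannot be demonstrated by road miles alone, which is one argument for the “provable safety assurances” he advocates.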

No federal standards

Most self-driving vehicles are equipped with radar sensors, which use radio waves, and lidar sensors, which use lasers to detect obstacles around the vehicle. There are no federal standards yet specifying how such systems should work. Congress and federal regulators are still debating how tightly to regulate such systems.
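
For illustration, a minimal sketch of the kind of obstacle check lidar returns feed into; every name and threshold here is hypothetical, and real perception stacks additionally cluster, classify, and track objects over time:

    # Hypothetical sketch: flag lidar returns inside a forward corridor.
    # All thresholds are illustrative assumptions, not from any real system.
    from dataclasses import dataclass

    @dataclass
    class LidarPoint:
        x: float  # meters ahead of the vehicle
        y: float  # meters left (+) or right (-) of vehicle center
        z: float  # meters above the road surface

    def points_in_path(points, max_range=30.0, half_width=1.5):
        """Return lidar points lying in the vehicle's forward corridor,
        ignoring returns near the ground (a naive ground filter)."""
        return [p for p in points
                if 0.0 < p.x < max_range
                and abs(p.y) < half_width
                and p.z > 0.2]

    # Usage: any surviving points suggest an obstacle in the driving path.
    scan = [LidarPoint(12.0, 0.4, 1.1), LidarPoint(50.0, -3.0, 0.5)]
    if points_in_path(scan):
        print("obstacle detected ahead")

The absence of federal standards means there is no mandated test for how reliably such a pipeline must detect a pedestrian, at what range, or in which conditions.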

“There should be vision tests for the sensors they are using, both static and dynamic to see how well they work,” said Missy Cummings, a Duke University mechanical engineering professor.

The short video recorded by cameras in the Uber vehicle that struck pedestrian Elaine Herzberg as she crossed a street in Tempe, Arizona, late Sunday raises questions about whether the Uber system responded any better than a human driver would have, experts said on Wednesday.

Uber has hired human operators to sit in the driver’s seat of its autonomous vehicles and intervene if necessary. The video released by Tempe police shows a human operator behind the wheel of the Uber vehicle before the impact.

The operator is seen looking down, away from the street, in the seconds before the vehicle struck Herzberg. Herzberg was pushing a bicycle across the street, from the left lane into the right lane, where the Uber vehicle was driving.

“It seems it should have detected her,” Daniel Sperling, director of the Institute of Transportation Studies at the University of California, Davis, told Reuters in an email after viewing the video. “It seems unlikely that a human driver would have done better. We do want AVs to do better than us and the potential exists.”

Americans were wary of autonomous vehicle technology even before Sunday’s fatality.

According to a Reuters/Ipsos opinion poll released in late January, two-thirds of Americans are uncomfortable with the idea of riding in self-driving cars.

“The greater risk for the industry is that if people feel it is unsafe, or the testing is unsafe, you’ll see a real backlash against this technology,” said Matthew Johnson-Roberson, co-director of the University of Michigan Ford Center for Autonomous Vehicles.