(Reuters) – Uber Technologies Inc [UBER.UL] on Monday said it has retained a former top U.S. transportation official to advise it on safety after a fatal self-driving crash in March, but it declined to comment on a technology website’s report that a software flaw was responsible for the accident.
The Information reported on Monday that Uber has determined the likely cause of the collision in March that killed a pedestrian was a problem with the software that decides how a self-driving car should react to objects it detects. The outlet said the car’s sensors detected a pedestrian but the software decided the car did not need to react right away.
“We can’t comment on the specifics of the incident,” Uber said regarding the report, citing an ongoing investigation by the National Transportation Safety Board (NTSB).
In the March 18 accident, an Uber self-driving vehicle struck and killed a 49-year-old woman who was walking across a street in the Phoenix suburb of Tempe.
Uber, which suspended testing of autonomous vehicles after the accident, said on Monday it was reviewing its self-driving program and had retained Christopher Hart, a former chairman of the NTSB, to advise it on safety.
“We have initiated a top-to-bottom safety review of our self-driving vehicles program, and we have brought on former NTSB Chair Christopher Hart to advise us on our overall safety culture,” Uber said. “Our review is looking at everything from the safety of our system to our training processes for vehicle operators, and we hope to have more to say soon.”
Hart, who was named chairman of the Washington-area Metrorail safety commission in March, did not immediately respond to a request for comment.
In a video of the crash released by police, the Uber vehicle appeared not to brake before it struck the woman. A human operator was sitting behind the wheel and, in the video, appeared to be looking down rather than at the road. Just before the video stopped, the operator looked up toward the road and appeared shocked.
The NTSB is expected to issue a preliminary report on the Arizona Uber crash in the coming weeks, a spokesman said.
The National Highway Traffic Safety Administration (NHTSA) is also investigating the incident and declined to comment.
Bryant Walker Smith, a self-driving car expert and law professor at the University of South Carolina, said in an email that the report by The Information raised the question of whether Uber’s “software might have detected something but misclassified as something other than a human (which could include determining that the probability of that something being a human was low).”
False positives and negatives have long been a challenge for self-driving and semi-autonomous driving systems, he said, but detecting a pedestrian crossing a street “doesn’t seem like an edge case” that would have been difficult for a self-driving car to handle.
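To illustrate the kind of logic Smith describes, here is a minimal, entirely hypothetical sketch of how a confidence threshold on object classification can produce a false negative. Nothing here reflects Uber’s actual software; the function name, data layout, and threshold value are all invented for illustration.

```python
# Hypothetical illustration only: a perception pipeline that brakes
# solely when an object is classified as a pedestrian with high enough
# confidence. The 0.8 threshold is an invented example value.
BRAKE_THRESHOLD = 0.8

def should_brake(detections):
    """Return True if any detected object is labeled a pedestrian
    with confidence at or above the braking threshold."""
    return any(
        d["label"] == "pedestrian" and d["confidence"] >= BRAKE_THRESHOLD
        for d in detections
    )

# False negative: the sensors detect the object, but the classifier's
# confidence that it is a pedestrian is low, so no braking is triggered.
sensed = [{"label": "pedestrian", "confidence": 0.55}]
print(should_brake(sensed))  # False
```

In such a design, the sensors can “see” an object while the decision layer still declines to react, which is consistent with the failure mode Smith raises: detection succeeding but classification confidence falling below an action threshold.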
Uber’s chief executive officer, Dara Khosrowshahi, said in April that Uber still believed in prospects for autonomous transport, saying that “autonomous (vehicles) at maturity will be safer.”
Hart, who was chairman of the NTSB when it opened a probe into a fatal Tesla crash involving a driver using the vehicle’s Autopilot system, said in 2016 that self-driving cars would not be perfect.
“There will be fatal crashes, that’s for sure,” Hart said, but he said that would not derail the move toward driverless cars.
Reporting by David Shepardson; editing by David Gregorio and Leslie Adler