Companies testing autonomous vehicles and self-driving technologies have one giant fear: a fatality. That fear became a reality last night for Uber as one of its self-driving vehicles killed a pedestrian in Tempe, Arizona.
The incident occurred when the vehicle was in autonomous mode; however, there was a safety engineer behind the wheel, which is standard Uber practice. There were no other passengers in the vehicle at the time.
According to a USA Today report, the woman, who has not yet been named, was crossing outside the designated crosswalk at about 10 p.m. when she was hit, police said.
In a statement, Uber said it was cooperating fully with local authorities. “Our hearts go out to the victim’s family. We are fully cooperating with local authorities in their investigation of this incident,” an Uber spokeswoman, Sarah Abboud, said in a statement.
Uber CEO Dara Khosrowshahi referenced the “incredibly sad news out of Arizona” in a tweet and confirmed the company is cooperating with local law enforcement.
The company also immediately suspended testing in its other locations in the U.S. and Canada, as is its policy. Uber tests in Pittsburgh, San Francisco and Toronto.
Sgt. Ronald Elcock, a Tempe police spokesman, told USA Today that the car was in autonomous mode with a driver behind the wheel when it hit the pedestrian. The National Transportation Safety Board was sending a small team of investigators to Arizona to gather information about the Uber crash.
“As always we want the facts, but based on what is being reported this is exactly what we have been concerned about and what could happen if you test self-driving vehicles on city streets,” said Jason Levine, executive director of the Center for Auto Safety, a Washington-based advocacy group. “It will set consumer confidence in the technology back years if not decades. We need to slow down.”
Although self-driving cars are ultimately expected to be safer than cars piloted by humans, research to develop these vehicles needs to cross into potentially dangerous territory to make the vehicles foolproof.
Researchers and engineers are working on the technology now to try to get it to that level of reliability. While no one is comfortable with the idea of possible fatalities during testing, companies are now past the point where those risks can be avoided entirely.
This isn’t Uber’s first accident in Tempe involving an autonomous vehicle. Last year, one of Uber’s Volvo XC90s, operating in self-driving mode, was struck in an intersection by the driver of another vehicle who failed to yield. She hit the Volvo XC90 hard enough to flip it onto its side.
“We’re within the phase of autonomous vehicles where we’re still learning how good they are. Whenever you release a new technology there’s a whole bunch of unanticipated situations,” said Arun Sundararajan, a professor at New York University’s business school. “Despite the fact that humans are also prone to error, we have as a society many decades of understanding of those errors.”
So predictable. Autonomous vehicle technology is going to make a mint for those in the legal profession. And this tragedy is just the beginning.
I am concerned that cautious programming will lead to unexpected stops, resulting in a human-driven car rear-ending the autonomous car. Legally, the human will be at fault, but the crash is still due to AI.
I’m just another guy who has worked with computers for decades. When we get to the point that software isn’t buggy, components don’t fail, and so on, then perhaps the automobile industry will offer a real step forward. I believe that currently, money would be better spent on improving driver training and toughening licensing requirements. Too many people behind the wheel are a menace to others.
I hope this doesn’t slow down the move toward autonomous vehicles. Keep in mind:
1) Companies like Uber and FedEx can’t make as much profit if they have to pay for a driver behind the wheel.
2) Millennials can’t be expected to drive when they have important texts to read and write.
3) Millennials will never be as good at driving as an autonomous machine if they only leave their parents’ basement once a week.
4) Was the pedestrian who got hit wearing appropriate clothing that could reasonably be recognized by the Uber machine (reflective tape, flashing LEDs, RFID transponder, coat with integral air bag, …)?
5) Was the pedestrian properly trained in interfacing with an autonomous vehicle?
6) S#!$ happens.
Jack, brilliant!
I just saw the Uber video of the accident on WSJ.
https://www.wsj.com/articles/video-shows-final-seconds-before-fatal-uber-self-driving-car-crash-1521673182
It’s pretty obvious the autonomous car had no better vision than a person has. If it had radar or night vision, this pedestrian would have been ‘seen’.