SAN FRANCISCO: Ride-sharing giant Uber said on Monday (Mar 19) it is suspending its self-driving car programme after one of the vehicles struck and killed a pedestrian in the US state of Arizona.
Elaine Herzberg, 49, was walking her bicycle outside the crosswalk on a four-lane road in the Phoenix suburb of Tempe on Sunday when she was struck by the Uber vehicle travelling at about 65km/h, police said. The car was in autonomous mode with an operator behind the wheel.
Herzberg later died from her injuries in a hospital, police said.
"The pedestrian was outside of the crosswalk. As soon as she walked into the lane of traffic she was struck," Tempe Police Sergeant Ronald Elcock told reporters at a news conference. He said he did not yet know how close Herzberg was to the vehicle when she stepped into the lane.
Elcock said he believed Herzberg may have been homeless.
Volvo, the Swedish car brand owned by China's Geely, confirmed its XC90 SUV was involved in the crash but said the software controlling the car was not its own.
"Our hearts go out to the victim's family," an Uber spokesperson told AFP. "We are fully cooperating with local authorities in their investigation of this incident."
Uber said it had temporarily halted its use of self-driving cars for testing or customer rides in Tempe, Pittsburgh, Toronto, and San Francisco.
Tempe is one of just two cities - along with Pittsburgh - where the ride-sharing firm has been using autonomous vehicles as part of its regular passenger fleet.
The vehicle operator in the driver's seat was the only person in the car when the accident occurred, Uber said. The car was in police hands on Monday.
Sunday's accident was the first fatal self-driving car crash involving a pedestrian.
The first deadly self-driving car accident was reported in mid-2016, and involved a Tesla.
The Tesla Model S, cruising on "Autopilot," failed to detect a crossing tractor-trailer against a bright sky, killing the driver - who it later emerged had kept his hands off the wheel for extended periods of time despite automated warnings not to do so.
Investigators at the US National Transportation Safety Board determined the probable cause of the Tesla crash was the combination of "a truck driver's failure to yield the right of way and a car driver's inattention due to overreliance on vehicle automation."
Autonomous-vehicle technology has been touted as having the potential to save fuel, ease congestion and prevent thousands of deaths by avoiding accidents caused by human error.
As with the fatal Tesla crash, however, the deadly Uber accident is likely to stoke concerns that the industry is moving too fast.
Google-owned Waymo this month began using its self-driving trucks to haul cargo bound for the internet giant's data centers in Georgia, while rival Uber announced the use of self-driving semi trucks as part of an on-demand trucking service in Arizona.
In September, US Transportation Secretary Elaine Chao released new guidelines that permit more testing of self-driving cars.
But America's non-profit Consumer Watchdog has warned that roads are being turned "into private laboratories for robot cars with no regard for our safety."
The group on Monday called for a nationwide moratorium on testing self-driving cars on public roads while investigators figure out what went wrong in the Uber accident.
"Arizona has been the wild west of robot car testing with virtually no regulations in place," Watchdog technology project director John Simpson said in a statement.
"When there's no sheriff in town, people get killed."
CAR VISION TESTS?
US states set their own rules for roads, and a handful have passed laws allowing self-driving vehicles.
California and Arizona have been particularly encouraging, hoping that companies developing autonomous technology in those states will create local jobs and facilities devoted to a promising new industry.
Duke University robotics professor Missy Cummings is among the advocates of slowing down introduction of autonomous vehicles to avoid risk and get proper regulations in place.
While machines are better at staying vigilant and reacting to routine situations, human drivers have proven superior at handling unusual or unexpected situations, according to the professor.
Cummings reasoned that if people must pass vision exams to get driving licences, self-driving cars should be required to pass equivalent tests.
She noted a case in which putting stickers on a stop sign could fool autonomous car sensors into seeing it as a sign indicating a speed limit.
"If we are still learning at this rate, and still uncovering major problems, it begs the question of why we are trying to put this technology into widespread use," Cummings told AFP.
"I am a big fan of the technology, but it is very unproven and experimental."