Fatal Accident Raises Questions About The Safety Of Testing Self-Driving Cars

“In some ways we are putting a greater burden on the autonomous technology than we do on human drivers.”

By Michael Marks | March 20, 2018 11:40 am

Late Sunday night in Tempe, Arizona, a pedestrian was struck and killed by a vehicle. This in and of itself is tragic but not unusual; in 2015, over 5,000 pedestrians were killed in traffic crashes. But this one was different, because the car in Tempe was driving itself. It was an automated vehicle owned by Uber.

While there was a person in the driver’s seat – presumably as backup – they were not driving the car at the time of the crash. This is believed to be the first fatal crash involving a self-driving car. Uber has since suspended its tests of the technology in Tempe, as well as similar projects in Pittsburgh, San Francisco, and Toronto.

Dr. Johanna Zmud, head of the planning division at the Texas A&M Transportation Institute, says that the incident emphasizes the care that is needed in testing.

“There’s obviously a pressure to commercialize autonomous vehicle technologies,” Zmud says. “But that pressure must be balanced with the need to be completely satisfied, through research and testing on closed courses, that these technologies are as foolproof as possible before their widespread introduction to public roadways.”

The crash on Sunday happened in an area with considerable traffic, which raises the question of whether authorities are allowing testing on public streets prematurely.

“It is necessary eventually for these vehicles to be tested on public roads, but it needs to be done carefully in order to ensure that the technology works,” Zmud says. “We don’t know to what extent their technologies are fully mature.”

Despite the incident, Zmud trusts that autonomous vehicles, if fully developed and thoroughly tested, will be safer than human drivers.

“This is one fatality on an average day. We still have human drivers out there who are making decisions that benefit society and other decisions that don’t,” she says. “In some ways we are putting a greater burden on the autonomous technology than we do on human drivers. The question that we ask ourselves is, ‘Is the technology safer than a human driver?’ I don’t expect it to be 100 percent safe, but I do expect it to be safer than a human driver.”

Zmud says the accident will likely reinforce the mindset of those who are highly concerned about autonomous vehicle safety.

“But those people in the general public – and right now it is a little more than half – who intend to use self-driving vehicles, they will see this as an incident,” she says, “but it won’t affect their long-term acceptance of the vehicle technology.”

Written by Cesar Lopez-Linares.