The death of a pedestrian after being hit by a self-driving Uber raises serious issues. Who’s at fault? What’s being done to keep this from happening again? Get the details in this iDriveSoCal Podcast.
Recorded March 21, 2018
Tom Smith: Welcome to iDriveSoCal, the podcast all about mobility from the automotive capital of the United States – Southern California.
I’m Tom Smith and this episode is about autonomous driving – but with a somber tone.
You’ve probably heard about the recent accident in Tempe, Arizona where a self-driving car struck and killed a pedestrian.
It happened around 10 o’clock in the evening this past Sunday, when a 49-year-old woman reportedly stepped onto a road in what police describe as a shadowy area, not within a crosswalk.
She was immediately struck by an Uber vehicle that was in self-driving mode and she later died at the hospital.
The accident has already sparked a great deal of news coverage and debate.
Before going any further I need to state that my thoughts, and those of the entire iDriveSoCal community, are with the victim’s family. We offer our condolences at this very difficult time.
With that said this tragic accident is raising some very serious issues regarding self-driving technologies overall.
As we’ve reported, the race to deliver self-driving tech has many entrants and stake-holders.
As we’ve also reported, there are groups that want to tap the brakes on the pace at which driverless cars are being tested, much less deployed.
The details of the accident are still emerging but here’s what we know as of recording this podcast:
- The woman was walking a bicycle while attempting to cross the road.
- And again, she was not using a crosswalk, in what’s described by police as an area where shadows make it difficult to see.
- The road itself was four lanes – two in each direction.
- There was a human back-up driver in the Uber vehicle at the time of the accident.
- The human back-up driver was not in control of the vehicle and did not take control of the vehicle in an attempt to avoid the accident.
- Neither the pedestrian that was killed nor the human Uber back-up driver were impaired.
- Police have reviewed video footage of the accident that was captured from the self-driving Uber vehicle. That’s where they described the woman having entered the road from out of the shadows. Here’s what else police have stated:
- The accident would have been “difficult” to avoid for any driver – machine or human.
- The accident was not likely Uber’s fault but the decision on any possible charges would come from the Maricopa County Attorney’s office.
The final and, I think, key element of this terrible accident is this fact – the vehicle made no attempt to stop or avoid the accident.
Here’s why that’s significant.
There were not one or two but three redundancies that could have potentially kept this accident from occurring – or at least indicated that there was an attempt to avoid it. None of those redundant safety measures appear to have activated.
Here are the safety measures I’m talking about:
First, of course, you have Uber’s self-driving technology. Some will be quick to blame that for failing. And, I say maybe – but what else did or didn’t happen?
Second, you have Uber’s human back-up driver. The same camp that would be quick to blame Uber’s self-driving technology for failing will surely take aim at the human error of the back-up driver. (After all, some reports are already calling him a convicted felon with time spent behind bars. That issue aside.) And again, I say maybe – but what else did or didn’t happen?
Third, the vehicle itself was a Volvo SUV, a model known to offer collision-avoidance technology. Now, maybe this one didn’t have it, or maybe it did and it was disabled for some reason – possibly because it interfered with the self-driving tech Uber was running within the vehicle.
My point is that those are three – admittedly unconfirmed, but very likely – safety redundancies that didn’t engage at all.
Why? Because there simply wasn’t time. Because this was an accident that very well may have been – and this is another term police are quoted as using – quote, “unavoidable,” unquote.
Now, it’s terrible and tragic for sure. And again, our condolences go out to the pedestrian’s family.
Uber has voluntarily suspended its self-driving testing.
And somewhat surprisingly, Toyota has voluntarily suspended its self-driving testing as well. Toyota says the reason for the suspension is the emotional effect this recent accident may have on its own drivers involved in testing its self-driving tech. (By the way – Toyota isn’t even testing in the state of Arizona, which, again, is where the accident occurred.)
Also, Consumer Watchdog, with whom we did a podcast on this very topic just a few weeks ago, is calling for a nationwide moratorium on the testing of self-driving technologies.
This is a tough issue for sure. It’s not the first death involving self-driving technology. That already happened in Florida a couple of years ago, when a driver died while using the Autopilot feature on his Tesla.
But this is the first pedestrian death involving self-driving tech.
What’s ironic about this accident, regardless of who or what is eventually determined to be at fault, is that the technology being questioned as the cause is the same technology being developed to eventually minimize – and entirely prevent – these very types of accidents from ever happening in the first place.
Both the NTSB, the National Transportation Safety Board, and NHTSA, the National Highway Traffic Safety Administration, are investigating.
We’re sure to hear more about this topic as our future of mobility continues to evolve.
For iDriveSoCal, I’m Tom Smith, thanks for listening.