The Driverless Tech Conundrum - Pedestrian’s Death Raises Issues

The death of a pedestrian after being hit by a self-driving Uber raises issues. Who’s at fault? What’s being done to keep this from happening again? Get the details in this iDriveSoCal Podcast.

***Transcript***

Recording date – March 21, 2018

Tom Smith: Welcome to iDriveSoCal, the podcast all about mobility from the automotive capital of the United States – Southern California. I’m Tom Smith, and this episode is about autonomous driving – but with a somber tone.

You’ve probably heard about the recent accident in Tempe, Arizona, where a self-driving car struck and killed a pedestrian. It happened around 10 o’clock in the evening this past Sunday, when a 49-year-old woman reportedly stepped onto a road in what police describe as a shadowy area, and not within a crosswalk. She was immediately struck by an Uber vehicle that was in self-driving mode, and she later died at the hospital.

The accident has already sparked a great deal of news coverage and debate. Before going any further, I need to state that my thoughts, and those of the entire iDriveSoCal community, are with the victim’s family. We offer our condolences at this very difficult time.

With that said, this tragic accident is raising some very serious issues regarding self-driving technologies overall. As we’ve reported, the race to deliver self-driving tech has many entrants and stakeholders. As we’ve also reported, there are groups that want to tap the brakes on the pace at which driverless cars are being tested, much less deployed.

The details of the accident are still emerging, but here’s what we know as of recording this podcast: The woman was walking a bicycle while attempting to cross the road. And again, she was not using a crosswalk, in what’s described by police as an area where shadows make it difficult to see. The road itself was four lanes – two in each direction. There was a human back-up driver in the Uber vehicle at the time of the accident.
The human back-up driver was not in control of the vehicle and did not take control of the vehicle in an attempt to avoid the accident. Neither the pedestrian who was killed nor the human Uber back-up driver was impaired. Police have reviewed video footage of the accident that was captured by the self-driving Uber vehicle. That’s where they described the woman having entered the road from out of the shadows.

Here’s what else police have stated: The accident would have been “difficult” to avoid for any driver – machine or human. The accident was not likely Uber’s fault, but the decision on any possible charges would come from the Maricopa County Attorney’s office.

The final and, I think, key element of this terrible accident is this fact: the vehicle made no attempt to stop or to avoid the collision. Here’s why that’s significant. There were not one or two but three redundancies that could potentially have kept this accident from occurring, or at least indicated that there was an attempt to avoid it. None of those redundant safety measures appears to have activated.

Here are the safety measures I’m talking about: First, of course, you have Uber’s self-driving technology. Some will be quick to blame that for failing. And I say maybe – but what else did or didn’t happen? Second, you have Uber’s human back-up driver. The same camp that would be quick to blame Uber’s self-driving technology for failing will surely take aim at the human error of the back-up driver. (After all, some reports are already calling him a convicted felon with time spent behind bar...