Getting hit by a self-driving car hurts more


You’ll surely be blamed for getting hit. The opprobrium will be worse than what you’d face for getting hit by a car while not wearing a helmet.

That’s what happened when Uber’s self-driving car killed Elaine Herzberg in Tempe, Arizona the other week. Of course, she was blamed for coming out of nowhere, specifically “from the shadows.”

Only it turns out that both the autonomous car and the driver, who was present for situations just like this, should have seen her. The car was at least under the speed limit, doing 38 mph in a 45 mph zone, which is a good thing. But it should have sensed her presence and reacted accordingly.

Not only was the professional driver not watching the road, as they were being paid to do, but Uber had cut back on the number of sensors attached to their (somewhat) autonomous cars. The Ford Fusions they used to run carried seven lidar sensors; the Volvo SUVs they’ve been using most recently carry a single spinning sensor on the roof. Uber had also disconnected the Volvo’s standard collision-avoidance technology.

Worse still, you, or your family, might not have any legal recourse. It isn’t just that some governments, like Arizona’s, are pretty much letting self-driving car companies do what they want, washing their hands of any oversight. It’s telling that Uber had a self-driving car run a red light on the very day it announced the service in San Francisco (Uber claims a driver was at the wheel, but this is Uber), which is part of why they packed up there and moved their autonomous car project to Arizona. The other part is that the governor promised no meaningful oversight.

Equally problematic is that without a driver in the car, it’s hard to tell who is responsible. And the companies getting involved are sufficiently deep-pocketed that they’ll not only lobby their way to minimal oversight and responsibility, but they’ve got the means and the desire to make it seem like any error is on the part of the person who got killed, not them. After all, we’ll be told, machines don’t make mistakes.

If the blame appears to fall on the autonomous vehicle, the best we can expect is a non-denial denial and a non-apology along the lines of “mistakes were made.”

The promise of self-driving cars is so great. Driving is by turns boring and nerve-wracking. Keeping between the lines, keeping a safe distance from the car ahead, avoiding potholes, avoiding people, waiting for an opening to turn left: none of it is fun. Too often, you’re late, traffic is slower than you expected, and the weather isn’t ideal. Good thing there’s a radio. Bad thing there’s a phone connected to it or sitting close by.

Human drivers have lots of problems, and taking the responsibility of driving out of their hands seems to make sense. Especially with a rapidly aging population that is highly dependent on personal automobiles to take care of necessities, in communities designed around everyone having a car in their driveway, it seems like self-driving cars can’t come fast enough.

On the other hand, we don’t know if self-driving cars are any better at driving safely than people. We assume they can’t be distracted, can keep a safe distance, won’t get anxious waiting to turn left. But thanks to some governments abdicating their oversight responsibility and an industry that doesn’t want to share unmassaged statistics, or any raw data, we have no way of knowing.

It would be great to know that autonomous cars have been programmed to pass cyclists with a three-foot buffer, something that is increasingly becoming law. But we don’t know. And pedestrians and cyclists are two groups we know self-driving cars aren’t dealing with well.

The “trolley problem” is a serious ethical issue that has to be grappled with: how does an autonomous vehicle decide whether to hit the cyclist or injure the people inside the car? But that dilemma can’t even be reached yet, as there isn’t enough information to know what the options actually are. It could just be “moral luck” that self-driving vehicles appear to be safer at this moment.

For all the promise of self-driving cars, we are overlooking solutions that might be more reachable. Today. Assistive driving technology is pretty amazing. Rearview cameras for reversing, sensors that can detect vehicles in blind spots, warnings when you’re getting too close to the car in front, and more. Taking a step back, smarter road designs can make a real difference in safety as well.

Worst of all, the dangling promise of self-driving cars is taking focus away from the biggest problem: our over-reliance on the personal automobile. We could be improving shared transit and mass transit, and getting people past the perceived necessity of owning cars that carry a single person and sit empty most of the time. The technology for changing this has been around for years.

Getting hit by a self-driving car would suck: you’d be collateral damage, a data point for improving a technology we have no clue how close or far it is from being successfully (safely) realized, and then you’d be blamed for it. On the other hand, it’s about what cyclists can expect already.




Image credit: By Grendelkhan – Own work, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=56611386
