Self-driving cars have a big obstacle to success: your brain

Around the time of the recent Amtrak crash, Vox published an interesting piece called Cars kill more people. But there's a good reason train crashes seem scarier. Most of us are fully aware of the statistics: traveling by plane or train is much, much safer than traveling by car. While fatal automobile accidents are not that common relative to the sheer volume of cars on the road, they're still far more common than deaths by plane or train — Vox puts it at 7.3 deaths per billion miles for driving, and just 0.43 and 0.07 for trains and planes, respectively.
And yet we're more afraid of flying, and there are two reasons. The first is a cognitive bias: the availability heuristic. This is a tendency for us to overestimate the likelihood of events that are easily recalled and/or have a strong emotional component. When a plane crashes and 80 people die, it's big news that can get days or even weeks of coverage; but 80 people die on roadways every day, and we hear almost nothing about it. 

The other reason is that we're more afraid of something when we don't have control. From a study cited in the Vox article:
Similar to the voluntary aspect, risks perceived to be under one's own control are more acceptable than risks perceived to be controlled by others. Under normal conditions we are unwilling to enter "out of control" situations because we lack security under such circumstances. We have the impression that as long as we maintain control we can – at least partially – remedy that evil. Being unable to gain control of a situation creates a feeling of powerlessness and helplessness: the individual suffers risk!
What does this have to do with self-driving cars? Well, everything. Google and other companies are going to have their work cut out for them convincing people to hand their cars over to a computer at highway speeds. Even if the crash statistics are well on the side of the computer-controlled cars, the human brain is wired to think we're better off when we're in control; there will have to be an active process of re-conditioning.
The other problem is that since self-driving cars are an emerging scientific frontier with hugely disruptive potential, any crash is going to be big news. Google cars have been involved in 11 crashes, but none of them were the fault of the cars. It's a safe bet that a computer is ultimately going to be a much better driver than you or I. But when it happens, that 0.0001% of the time that the computer miscalculates will do much more harm to the autonomous cars' marketability than any human-caused crash ever would.

Of course, the really big obstacle for self-driving cars will be the simple fact that Americans love to drive. Sure, we may see Google cars replacing taxis or (hopefully) becoming a safe means of transportation for the elderly, but most of us enjoy being behind the wheel, and driving is not a pastime we'll be giving up anytime soon.

