The Not-So-Safe Self-Driving Car

Obeying the law can sometimes get you killed. Humans – those not asleep at the proverbial wheel – know this.

Self-driving cars don’t.

They are programmed to obey every law, all the time – regardless of circumstances. This is creating problems.

Potentially, fatalities.

Example: Up ahead, there’s a red light. Your autonomous Google car is coming to a stop because its sensors can tell the light’s red. But your Google car hasn’t got a brain, so it can’t override its Prime Directive to deal with the big rig coming up behind you that’s locked up its brakes and is clearly going to crush you to death in about three seconds if you don’t run the red light and get out of the truck’s way.

You’d mash the accelerator pedal, blow the light. But the Google car won’t let you. That would be illegal.

So now, you’re dead.

Or, you’re trying to make your way home in a blizzard. If you’re the one controlling the car, you know that coming to a full stop for a stop sign at the crest of a steep hill will probably send your car sliding back down the hill and into the cars behind you.

So, you California Stop the sign. It’s technically illegal – but it’s the right thing to do, because it keeps your momentum up and keeps the car under control.

The Google car would stop. And you’d roll back down the hill.

Evasive/emergency maneuvers are almost always technically illegal. But they are often the only way to avoid an accident.

Humans can process this – and are capable of choosing the lesser of two evils. A driverless car cannot. It only knows what the sign (and the law) says and is programmed to obey as doggedly as Arnold’s T-800 in the Terminator movies.

Nuance is not yet a machine thing.

And that’s a real problem, not a hypothetical one. Prototype driverless cars already in circulation have twice the crash rate of cars with human drivers, according to a just-released study by the University of Michigan Transportation Research Institute.

Apparently, Bobo (human drivers) not so stupid after all.

It’s not that autonomous cars are stupid. It’s that they lack the uniquely (so far) human attribute of judgment. They cannot weigh alternatives. It is either – or. Black – or white. Parameters are programmed in – and the computer executes those parameters without deviation.

Because that’s what it was programmed to do.
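The logic being described is easy to caricature in code. Here’s a minimal sketch – in Python, with made-up names and rules, not anyone’s actual autopilot software – of a controller that checks the law first and executes without deviation, next to a “human” that weighs the lesser of two evils:

```python
# A hypothetical sketch of rigid rule-following versus human judgment.
# The names and rules below are invented purely for illustration.

from dataclasses import dataclass

@dataclass
class Situation:
    light_is_red: bool
    truck_closing_fast_behind: bool  # brakes locked, impact imminent

def rule_following_car(s: Situation) -> str:
    # The machine's logic: the law is checked first and is absolute.
    # Context never enters into it.
    if s.light_is_red:
        return "stop"  # proceeding would be illegal, so it won't
    return "proceed"

def human_driver(s: Situation) -> str:
    # A human weighs the alternatives: a ticket for running the light
    # versus being crushed by the truck.
    if s.truck_closing_fast_behind:
        return "run the light"  # technically illegal, but you live
    if s.light_is_red:
        return "stop"
    return "proceed"

# The red-light-with-a-runaway-truck scenario from above:
emergency = Situation(light_is_red=True, truck_closing_fast_behind=True)
print(rule_following_car(emergency))  # -> "stop" (and you're dead)
print(human_driver(emergency))        # -> "run the light"
```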
