Obeying the law can sometimes get you killed. Humans – those not asleep at the proverbial wheel – know this.

Self-driving cars don’t.

They are programmed to obey every law, all the time – regardless of circumstances. This is creating problems.

Potentially, fatalities.

Example: Up ahead, there’s a red light. Your autonomous Google car is coming to a stop because its sensors can tell the light’s red. But your Google car hasn’t got a brain, so it can’t override its Prime Directive – obey the red – in order to deal with the big rig coming up behind you that’s locked up its brakes and is clearly going to crush you to death in about three seconds if you don’t run the red light and get out of the truck’s way.

You’d mash the accelerator pedal, blow the light. But the Google car won’t. That would be illegal.

So now, you’re dead.

Or, you’re trying to make your way home in a blizzard. If it’s you controlling the car, you know that coming to a full stop for a stop sign at the crest of a steep hill is probably going to result in your car sliding back down the hill and into the cars behind you.

So, you ‘California Stop’ the sign. It’s technically illegal – but it’s the right thing to do, in order to not lose momentum – and to avoid losing control.

The Google car would stop. And you’d roll back down the hill.

Evasive/emergency maneuvers are almost always technically illegal. But they are often the only way to avoid an accident.

Humans can process this – and are capable of choosing the lesser of two evils. A driverless car cannot. It only knows what the sign (and the law) says and is programmed to obey as doggedly as Arnold’s T-800 in the Terminator movies.

Nuance is not yet a machine thing.

And that’s a real problem, not a hypothetical one. Prototype driverless cars that are in circulation have twice the crash rate of cars with human drivers, according to a just-released study by the University of Michigan’s Transportation Research Institute.

Apparently, Bobo (human drivers) not so stupid after all.

It’s not that autonomous cars are stupid. It’s that they lack the uniquely (so far) human attribute of judgment. They cannot weigh alternatives. It is either – or. Black – or white. Parameters are programmed in – and the computer executes those parameters without deviation.

Because that’s what it was programmed to do.

Human drivers, on the other hand, can foresee the consequences of a developing situation and take action based on intangibles no computer can (yet) grok. Humans know that most traffic laws are, as they teach in law school, malum prohibitum (i.e., technical fouls, violations of a statute, certainly, but not moral violations) rather than malum in se (morally wrong, like stealing things).

Computers cannot appreciate the distinction. They defer to the law – even when that eighteen wheeler bearing down on you isn’t going to stop, regardless of what the law says about running red lights.

Humans also know when a law is ridiculous – and (provided no cop is around) will ignore it outright. And here we come to a possibly happy unintended consequence.

Autonomous cars may end up highlighting the ridiculousness of certain traffic laws; most posted speed limits, for instance. By obeying them. The old man in a Buick will be replaced by the autonomous Corvette doing exactly 65 with everyone else running 70-75 (at least).

The machine-minded, Cloverized Corvette will never move over.

He – it – is “doing the limit,” after all.

There are only a few old men in Buicks out on the highway. But there could be millions of autonomous cars. All programmed to do the speed limit – no matter how dumbed-down and preposterous for conditions, road… or car.

How about right on red? Forget it!

Even if it’s obviously clear – and safe – to proceed. The law is the law.

Merging and yielding? Better leave ten minutes early.

If a deer runs in front of the car, will the autonomous car swerve briefly (and illegally) into the other lane – and break the law forbidding crossing over the double yellow – in order to avoid the deer?

Probably not.

So, you wreck the car.

And if there’s a wreck, who gets the blame … and the bill? If the human inside is just a passenger, it’s hard to write him a ticket (or sue him for damages). But computers don’t care about DMV “points,” you can’t send them to driving school and they haven’t got any wages that can be garnished to pay your medical bills.

Ironically, these autonomously driven vehicles were touted as being more competently driven than cars driven by humans – one of the claimed benefits being that we’d be able to get where we’re going faster. But unless speed limits are raised dramatically – to reflect the speeds people are already driving, the law be damned – it’ll take us longer to get where we’re headed.

No more hammer time. Instead, a conga line of self-driving cars driving extra, extra cautiously – at the “safe” pace of the least common denominator. Which is what almost all traffic laws presume.

Imagine your car controlled by your fearful, hesitant and rigidly law-abiding mother-in-law. That’s the Autonomous Future looming in the rearview.

Oh, we’re gonna have some fun!
