I, Auto

If you’re a fan of classic science fiction, you may be familiar with Asimov’s Three Laws of Robotics. If not, here they are:

The First Law

A robot may not injure a human being or, through inaction, allow a human being to come to harm.

The Second Law

A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

The Third Law

A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

You may be asking, “What the hell does this have to do with cars?”

Well, I’m seeing more and more articles about self-driving cars… the stuff right out of sci-fi. Right? One local law firm is even running television commercials saying they are developing strategies now to win the lawsuits of the future when self-driving cars go rogue. So, let’s replace a few words in each law and explore how well autonomous vehicles (AVs) rate against Mr. Asimov’s laws.

A self-driving car may not injure a driver or, through inaction, allow a driver to come to harm.

Sounds great, eh? After all, we’re only human. We get distracted and make mistakes. AVs are packed with sophisticated sensors and electronics that can detect objects and calculate speed and distance. Self-driving cars are designed to keep the driver from coming to harm by eliminating “human error.”

Unfortunately, that technology just isn’t there yet. Accidents involving self-driving cars have resulted in injuries and deaths of drivers, passengers, and pedestrians. Some of those accidents were a direct result of the “driver” relying too much on the AV’s technology to do the driving. In one instance, an Apple exec was killed in his self-driving Tesla. Instead of being the failsafe with hands on the wheel and eyes on the road, he was focused on playing a video game.

So much for the car protecting the driver from harm. At this point, self-driving cars are unable to comply with the First Law.

A self-driving car must obey the orders given it by a driver except where such orders would conflict with the First Law.

You jump in your AV, punch in your destination, and hit go. Your self-driving car may be unable to comply. Poor weather, like heavy rain, snow, or fog, can interfere with the cameras and sensors that are vital to an AV’s ability to navigate.

There’s also the issue of AVs sharing a common network that allows them to recognize each other. Any networked system like that is a potential target for hackers. Imagine asking your car to take you to the grocery store and ending up behind an abandoned warehouse or in a dark alley.

Self-driving cars are unable to completely comply with the Second Law either.

A self-driving car must protect its own existence as long as such protection does not conflict with the First or Second Law.

Let’s pretend your self-driving car is rolling down the road at 55 miles per hour when, all of a sudden, a person steps out of a disabled car and into your path… just as a tractor-trailer is approaching from the other direction. How does the self-driving car react to a no-win situation like this?

Striking the person in the road would probably kill them. A clear violation of the First Law.

Swerving into the path of the oncoming tractor-trailer would result in a crash that would destroy the self-driving car and could kill the occupants and the truck driver. A clear violation of all three Laws.

In either case, who’s responsible? The guy who owns the self-driving car? He forked over big bucks for an AV so he didn’t have to drive and make those kinds of decisions. Is the manufacturer at fault? They’re the ones who programmed the car to react a certain way. Maybe it’s the programmer who deserves the blame? No wonder that law firm has started working on their strategies now… it’s gonna take years to figure out who to sue when an AV hurts or kills someone.

Either way, I don’t think self-driving cars can comply with the Third Law.

I think Mr. Asimov would agree with me… if you’re going somewhere, keep your hands on the wheel and drive yourself there.

Thoughts? Criticisms? Afraid SkyNet will start making self-driving cars? Let me know.