
Driverless cars and rules vs principles

December 18, 2015


The folks at the General Motors-Carnegie Mellon Autonomous Driving Collaborative Research Lab are now debating whether they should teach their self-driving cars to commit minor infractions from time to time. These minor infractions would mimic human behaviour to help keep the cars out of trouble, for example by nudging out at busy junctions.

Raj Rajkumar, the lab co-director, recently said:

It’s a constant debate inside our group. And we have basically decided to stick to the speed limit. But when you go out and drive the speed limit on the highway, pretty much everybody on the road is just zipping past you.

The rules, as codified in law, say that the speed limit should be obeyed, and that’s what Google, General Motors and other autonomous car companies are coding in their software. Human drivers on the other hand, and to a large extent the law enforcement agencies that control them, operate a more complex system that blends rules and principles. There are speed limits, and clear rules about right of way and so on, but bending the rules in certain well understood cases is expected. Nudging out in front of cars at busy junctions when they have right of way is a good example.

This blend of rules and principles has worked pretty well until now because humans are well equipped to judge when to break the rules and what principles to apply when they do. Also important is that we’ve developed a complex system of minor punishments to guide people to sensible behaviour.

Now we have to code that behaviour in software, which requires us to be explicit about exactly when it’s acceptable to break the rules.
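To see what being explicit might look like, here is a minimal, purely hypothetical sketch of a nudge-out policy. Every name and threshold in it is invented for illustration; the point is only that the software has to state the trade-offs that a human driver judges implicitly.

```python
from dataclasses import dataclass

# Hypothetical junction snapshot: all fields and thresholds are
# invented for illustration, not taken from any real system.
@dataclass
class JunctionState:
    wait_seconds: float       # how long the car has been stopped at the junction
    gap_metres: float         # gap to the nearest oncoming vehicle
    oncoming_speed_kmh: float # speed of the oncoming traffic

def may_nudge_out(state: JunctionState) -> bool:
    """Decide whether a minor, human-like infraction (nudging out
    without right of way) is acceptable. Coding this forces the
    trade-offs to be explicit rather than left to judgement."""
    waited_long_enough = state.wait_seconds > 30.0
    gap_is_workable = state.gap_metres > 15.0
    traffic_is_slow = state.oncoming_speed_kmh < 25.0
    return waited_long_enough and gap_is_workable and traffic_is_slow

print(may_nudge_out(JunctionState(45.0, 20.0, 20.0)))  # True: long wait, slow traffic
print(may_nudge_out(JunctionState(5.0, 20.0, 20.0)))   # False: hasn't waited long enough
```

Even in a toy like this, every number is a societal judgement call, which is exactly the debate Rajkumar describes.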

The complicated piece of that is getting agreement across society, where views on these topics vary by geography. I remember that when I moved to London I had to start driving more aggressively to get around town. Aggressive driving is the norm here, and if you drive the way I learned in my home town of Gerrards Cross you surprise other drivers and it ends up being more dangerous. Then when I go back to Gerrards Cross I have to remember to adjust my driving style.

Getting national agreement on how and when self-driving cars should commit infractions basically requires rewriting the law, or at least agreeing on how it should be rewritten. That will be tough to achieve, yet I suspect we won’t see the full potential of self-driving cars without it.

More interestingly, this debate over how self-driving cars should behave foreshadows a much wider debate about how artificial intelligences should behave. We have this rules-vs-principles tension all through society, and we currently get by with human judgement and systems of punishment. Because they are software, AIs will require us to be explicit about the trade-offs.