Self Driving Cars: Can Machines Learn Morality?

Self Driving Cars – Science Fiction is Becoming Science Fact

While planes have had autopilot for years, they still need a pilot for more complex maneuvers and for take-offs and landings. Driving a car, however, is a vastly more complex prospect, and a decade ago the idea of a car that could drive without the assistance of a human was something that belonged in the realm of science fiction and nowhere else.

In recent years, however, truly autonomous cars seem to be edging closer and closer to reality, and many major car manufacturers are investing in technology which will one day (and probably soon) allow cars to move around without a driver on board. If you rent a car or van from Priory, it might one day involve your chosen vehicle delivering itself.


The Problem With Self Driving Cars Is All The Humans

In an entirely automated environment, self driving cars would have an easier time. If every vehicle were self driving, each programmed to follow a set of rules and behave in a predictable manner, we might already have streets filled with autonomous vehicles. Unfortunately, self driving cars must be able to cope with sharing the roads with humans, who can choose not to follow the rules and behave unpredictably.

Humans are also responsible for the programming of self driving cars, which is another source of fallibility that must be accounted for. What if a self driving car encounters a situation for which it hasn't been programmed? What will it do? While a human might be able to react appropriately to a new or unexpected circumstance, machines must be programmed to cope properly with every possibility, and if something gets missed, the results are unpredictable. Artificial Intelligence is advancing in leaps and bounds and will likely go some way towards bridging the gap between programming and decision making 'on the fly', but what if AI is called upon to make a moral choice? If faced with the unavoidable choice of swerving to avoid a pedestrian and crashing into a lamp-post (which would injure the occupant), or hitting the pedestrian, thereby hurting the pedestrian but keeping the occupant safe, what would (or should) a car choose?


Self Driving Cars and ‘The Trolley Problem’

A well-known thought experiment is 'The Trolley Problem'. A runaway train is heading down the track towards five people, and in the scenario you're standing at a fork in the track, next to a lever which could send the train down another track with only one person on it. The decision you make is almost certainly fatal either way, so which do you choose? Do nothing and five people will die, but pull the lever and you'll essentially have caused the death of one person, even though you've saved five. Things get more complicated when you start factoring in roles for the people. What if the one person were a relative and the five were criminals? And so on.
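To see why this puts the programmer in the hot seat, here is a deliberately simplified, hypothetical sketch (not anything a real car uses) of the trolley problem reduced to a rule that picks whichever action harms the fewest people. The function name and the harm numbers are invented for illustration; the point is that someone has to write those numbers down in advance.

```python
# Hypothetical sketch: the trolley problem as a cost-minimising rule.
# The "harm" values assigned to each action are the programmer's own
# moral choices, written into the code before any emergency happens.

def choose_action(outcomes):
    """Pick the action whose outcome causes the least harm."""
    return min(outcomes, key=outcomes.get)

# Map each available action to the number of people harmed.
trolley = {
    "do_nothing": 5,   # the train continues towards the five people
    "pull_lever": 1,   # the train is diverted towards the one person
}

print(choose_action(trolley))  # -> pull_lever
```

The moment you start weighting people differently (a relative versus five strangers), those weights also have to be chosen by whoever writes the program, which is exactly the moral burden the article describes.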

Programming moral choices such as this into a machine, even an intelligent one, essentially makes the programmer the person standing by the lever. If the machine makes the programmed or learned choice and someone gets hurt in an unavoidable collision, for example, where does the blame lie? Driving, at its core, involves making decisions which could (and sometimes do) result in collisions. What is the morality behind letting a machine make a life and death decision unsupervised?

There have already been a few incidents where the programming or decision making of autonomous cars has led to collisions and, sadly, deaths. While the incidents were a result of the cars making bad choices, in many of these cases it has been reported that the humans who were tasked with 'supervising' weren't paying sufficient attention to what the car was doing.


When Will We See Truly Self Driving Cars On The Roads?

Self driving cars are already amongst us, although they are very few in number and are usually supervised by a human while the technology is assessed.

In theory, a driver which doesn't get angry, doesn't get tired and doesn't suffer from a number of other uniquely human impairments should be a better driver, and we have no doubt that in the coming decades the technology will be perfected and driving a car manually may even become a rarity.

If driverless cars become the norm, as they surely one day will, hiring a car from Priory will likely be a very different experience, but you can count on us to provide our usual friendly and personable service, no matter who, or what, is doing the driving.