Sep 12 2017

The Safety and Ethics of Self-Driving Cars

Germany just came out with their first regulations for self-driving cars that address how they will be programmed with respect to safety. Specifically – what should the programming do if harm cannot be completely avoided and it has to decide between the lesser of two bad outcomes? Germany is the first country to come out with such regulations, and therefore sets the example for other countries that will likely follow.

Here are the key elements of their decision:

  • Automated and connected driving is an ethical imperative if the systems cause fewer accidents than human drivers (positive balance of risk).
  • Avoiding personal injury must take precedence over avoiding damage to property. In hazardous situations, the protection of human life must always have top priority.
  • In the event of unavoidable accident situations, any distinction between individuals based on personal features (age, gender, physical or mental constitution) is impermissible.
  • In every driving situation, it must be clearly regulated and apparent who is responsible for the driving task: the human or the computer.
  • It must be documented and stored who is driving (to resolve possible issues of liability, among other things).
  • Drivers must always be able to decide themselves whether their vehicle data are to be forwarded and used (data sovereignty).

This all makes sense to me and I don’t see anything overly controversial. Prioritizing people over property is a no-brainer. Treating all people as of equal value also seems like the right move, because you could not individualize such decisions – you could only treat people demographically or as members of a group, and that would be too ethically fraught to be practical.
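
To make those first two bullet points concrete, here is a rough sketch of what the decision logic might look like in code. This is purely my own illustration (the report does not specify any implementation): candidate maneuvers are ranked by predicted injuries first and property damage second, and the data model deliberately contains no personal attributes, so the choice cannot discriminate between individuals.

    # Illustrative sketch only, with a made-up data model; not from any real
    # autonomous-driving system or from the German report itself.
    from dataclasses import dataclass

    @dataclass
    class Outcome:
        injuries: int           # predicted number of people harmed
        property_damage: float  # predicted damage in euros
        # No age, gender, or other personal attributes are stored, so the
        # ranking below cannot distinguish between individuals.

    def choose_maneuver(candidates: dict) -> str:
        """Pick the maneuver that minimizes injuries first, property damage second."""
        return min(candidates, key=lambda name: (candidates[name].injuries,
                                                 candidates[name].property_damage))

    options = {
        "swerve_left": Outcome(injuries=0, property_damage=20_000.0),
        "brake_only":  Outcome(injuries=1, property_damage=500.0),
    }
    print(choose_maneuver(options))  # "swerve_left": property loss accepted to avoid injury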

The rules assume for now that “driverless” cars have the ability to drive themselves, but still require a licensed, capable driver to take the controls when necessary. In fact, the report discusses the possibility that if the programming encounters a decision dilemma with regard to minimizing death and injury, it may turn control over to the human driver at that time to make the tough decisions.

That was the trickiest part of the recommendations, in my opinion. I certainly wouldn’t want the car to suddenly turn over control in an emergency situation. The driver would be hard-pressed to react quickly enough to produce a superior outcome to the programming just doing what it can. The report does say that if the driver does nothing, the car will brake and come to a stop by default. I suspect we’ll find that this default response is likely to be superior to the outcome of handing the human driver sudden control.
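
For what it’s worth, here is a minimal sketch of how that takeover-with-fallback behavior might be structured. It is my own illustration under assumed details (the vehicle interface and the three-second timeout are made up, not taken from the report): the car asks the driver to take over, and if nothing happens before the deadline it brakes to a controlled stop.

    # Sketch of a takeover request with a braking fallback. `vehicle` stands in
    # for a hypothetical control interface; it is not a real API.
    import time

    TAKEOVER_TIMEOUT_S = 3.0  # assumed value, purely for illustration

    def request_takeover(vehicle) -> str:
        vehicle.alert_driver("Take control now")
        deadline = time.monotonic() + TAKEOVER_TIMEOUT_S
        while time.monotonic() < deadline:
            if vehicle.driver_has_control():  # e.g. hands on wheel, pedal input
                return "driver"               # the human is now responsible
            time.sleep(0.05)
        vehicle.brake_to_stop()               # default: come to a controlled stop
        return "stopped"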

The report also explicitly recognizes that automated cars are safer than human-controlled cars. This is already true, and they will only get safer. After 1.8 million miles of driving, Google’s automated cars were in 13 fender-benders, 100% caused by other drivers.

Further, having automated cars on the road will increase safety for everyone – not just those in the automated cars. The higher the percentage of cars that are automated, the safer our roads will be. Germany concluded this creates an ethical imperative for governments to facilitate self-driving cars. Perhaps one day they will be mandatory.

The reasons for this enhanced safety are obvious. Driving safely requires constant attention and the ability to react to a sudden hazard with little warning. Humans are terrible at constant attention. We can become tired, distracted, and lose focus. We can become confused by complex intersections or road signs, or blinded by the sun. We may decide to drive while impaired, either because we are sleepy or inebriated.

Computers do not get distracted, do not lose focus, and are never fatigued. They are simply better drivers than humans. This fact alone is creating great pressure to adopt automated vehicles as quickly as possible, now that the technology is here. It seems like this tech will be like smartphones – in a very short time they will become the norm. With cars, however, a “short time” could be 20 years, because people tend to hold onto their cars for a long time – the average age of a car on the road in the US is 11.5 years. So even if every new car purchased were self-driving, it would take about a decade to replace most of the cars on the road.
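
A quick back-of-the-envelope calculation shows where that decade figure comes from. The fleet size and annual sales numbers below are rough assumptions on my part (roughly 260 million vehicles in operation and about 17 million new-vehicle sales per year in the US), not figures from the report:

    # Rough fleet-turnover estimate: if every new car sold were self-driving,
    # how long until they make up half the fleet? All numbers are assumptions.
    fleet_size = 260e6         # approx. US light vehicles in operation
    new_sales_per_year = 17e6  # approx. annual US new-vehicle sales

    share = 0.0
    years = 0
    while share < 0.5:
        # each year's (self-driving) sales replace an equal number of old cars
        share += new_sales_per_year / fleet_size
        years += 1
    print(years, "years to reach a 50% self-driving fleet")  # about 8 years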

But there are some areas where self-driving vehicles may take over more quickly. Trucks are one example. Shipping requires driving long distances, a much better job for a computer than a human. There are lots of trucks on the road, and if they had the enhanced safety profile of a self-driving car, that would make the roads much safer.

Taxis are another example. Uber has already demonstrated that the old taxi model is outdated. The new model of using an app and an algorithm to match driver and passenger is much better. Now make those Uber cars self-driving and the roads may quickly fill with driving as a service. This may significantly decrease the need and motivation for many people to own a car at all.

So even if there are a lot of old, traditional cars on the road for the next 20-30 years, the roads may be disproportionately occupied by automated vehicles fairly quickly, since high-mileage vehicles like trucks and taxis would convert first.

This is a good thing not only for safety, but for efficiency. Automated vehicles are also likely to be more energy efficient. They can be programmed to optimize acceleration and braking to minimize energy use. Again, this is something that humans are not very good at. Cars that give real-time feedback to the driver about energy use do improve fuel efficiency, but computer algorithms could come much closer to optimal.
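
To give a crude sense of what smoother control buys you, here is a toy physics calculation (my own illustration with assumed numbers, not a model of any real vehicle): every unnecessary stop discards the car’s kinetic energy, ½mv², as brake heat, energy the engine then has to supply again to get back up to speed.

    # Toy calculation: kinetic energy thrown away by one avoidable stop from 50 km/h.
    mass_kg = 1500.0      # assumed typical passenger-car mass
    speed_ms = 50 / 3.6   # 50 km/h converted to m/s
    wasted_joules = 0.5 * mass_kg * speed_ms ** 2
    print(f"{wasted_joules / 1000:.0f} kJ lost per avoidable stop")  # ~145 kJ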

Further, an integrated system could theoretically reduce travel times. Traffic could be streamlined and optimal routes calculated. Also, with Uber-type services, you won’t have to make round trips to drop people off.
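
Route optimization itself is well-understood; the hard part is doing it at city scale with live traffic data. As a minimal sketch of the underlying idea (a toy shortest-path search over made-up travel times, nothing like a production traffic system):

    # Toy shortest-travel-time routing using Dijkstra's algorithm.
    # Edge weights are made-up minutes between intersections A through D.
    import heapq

    graph = {
        "A": {"B": 4, "C": 2},
        "B": {"D": 5},
        "C": {"B": 1, "D": 8},
        "D": {},
    }

    def fastest_time(start, goal):
        queue = [(0, start)]  # (elapsed minutes, node)
        best = {start: 0}
        while queue:
            t, node = heapq.heappop(queue)
            if node == goal:
                return t
            for nxt, minutes in graph[node].items():
                if t + minutes < best.get(nxt, float("inf")):
                    best[nxt] = t + minutes
                    heapq.heappush(queue, (best[nxt], nxt))
        return float("inf")

    print(fastest_time("A", "D"))  # 8 minutes, via A -> C -> B -> D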

Once we get to the point that vehicles are fully automated, we won’t need a human driver at all, which could add further efficiency. It would reduce the number of people who need to be in the vehicle whenever the driver is only providing the driving service (like driving a truck or dropping someone off). That is 150 pounds or so less in the car, which improves fuel efficiency, and it would have the added benefit of saving time for all those unneeded drivers.

There really is no practical downside to changing over to self-driving cars. This is a huge win for society. One potential subjective downside is the perceived loss of freedom and the fun of driving. That factor may linger for a generation or two, but eventually driving a car will likely become like riding a horse – a pastime or sport rather than a necessity.

The German report also mentions the vulnerability to hacking. This is clearly a risk. They recommend that measures be taken to make automated vehicles and their support systems as secure from hacking as possible. This is likely to be the biggest safety issue, and we should definitely be spending a lot of resources figuring out how to bake in security right from the beginning. All such cars should, at the very least, have a fully manual override. The human driver should always be able to take full control in a way that is hack-proof.
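
One way to think about such an override (a conceptual sketch only; in practice this belongs in hardware interlocks rather than application code, and nothing here comes from the report) is a control arbiter in which a physical switch always wins over anything arriving from software or over the network:

    # Conceptual sketch: when the physical override is engaged, the driver's
    # direct input is passed through and any software or network command is
    # ignored. Real systems would enforce this in hardware, not in code like this.
    def select_command(physical_override_engaged: bool,
                       driver_input: dict,
                       autonomy_command: dict) -> dict:
        if physical_override_engaged:
            return driver_input
        return autonomy_command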

But that is not enough. Self-driving cars will need extreme security from hacking. To me that is the only big question mark – how secure can we make them?

Germany’s new regulations are a step in the right direction, and hopefully will motivate other countries to follow suit. This is going to be a relatively rapid technological change, and it’s best that governments get out in front of it thoughtfully.
