
The moral dilemmas of programming the self-driving car

Do you want to surrender to their decisions?

The moment I heard they were perfecting the self-driving car gave me very serious pause.

Maybe that’s because in some essential way I don’t trust handing the decision-making process over to a machine, even though I don’t like driving all that much and even though the evidence is that self-driving cars would almost certainly result in fewer accidents and fewer deaths overall. There’s just something very basic about the technology that I don’t trust, and it may be the very same basic thing in me that makes me especially concerned with protecting liberty and autonomy.

But I hadn’t spent all that much time thinking about the details. It turns out others have—they must, if they’re going to program these cars. And it’s no surprise that there are some knotty ethical problems involved.

Here’s one hypothetical:

Picture the scene: You’re in a self-driving car and, after turning a corner, find that you are on course for an unavoidable collision with a group of 10 people in the road with walls on either side. Should the car swerve to the side into the wall, likely seriously injuring or killing you, its sole occupant, and saving the group? Or should it make every attempt to stop, knowing full well it will hit the group of people while keeping you safe?

This is a moral and ethical dilemma that a team of researchers, led by Jean-Francois Bonnefon of the Toulouse School of Economics, discusses in a new paper posted on arXiv. They note that some accidents like this are inevitable as self-driving cars spread – and what the cars are programmed to do in these situations could play a huge role in public adoption of the technology.

“It is a formidable challenge to define the algorithms that will guide AVs [Autonomous Vehicles] confronted with such moral dilemmas,” the researchers wrote. “We argue to achieve these objectives, manufacturers and regulators will need psychologists to apply the methods of experimental ethics to situations involving AVs and unavoidable harm.”

Psychologists? Don’t bet on it. They’re no more equipped to make this decision than the average person. In fact, that’s the point: any one-size-fits-all solution is a surrender of individual autonomy and responsibility to a nameless, faceless algorithm that decides whether you live or die. There is no formula available for morality.

Of course, we all would have to make a split-second individual decision if (heaven forbid) we were faced with that hypothetical dilemma (“do I save myself or others?”) in a car we were driving. It is not clear what the “right” decision would be, but I think the individual should be the one to make it.

There’s a fascinating conversation on the subject going on in the comments section here, and I suggest you take a look. Some commenters argue (rather convincingly, I believe) that self-driving cars would actually handle the situation posited by the ethicists in a different and better way: the car would have sensed the problem in advance and slowed down, preventing the dilemma from arising in the first place.

I want to add that my distrust of highly automated systems remains. Perhaps it’s irrational, but the surrender of autonomy feels dangerous to me in a different way. What are we sacrificing for a predicted increase in physical safety, and is it worth it?

[Neo-neocon is a writer with degrees in law and family therapy, who blogs at neo-neocon.]


Comments

2nd Ammendment Mother | October 29, 2015 at 5:38 pm

The vast majority of American roadways are actually rural, not urban. Out here I’m less worried about a herd of people than about a herd of hogs or deer. Our bar ditches tend to be deep, and in rainy weather they are often filled with water or snow drifts… I just don’t see how an autonomous driver system accounts for all the variables that are possible.

There are lots more issues. If this becomes universally available, the number of vehicles and demand will jump, creating more traffic and pollution – which invites more regulation, and we all know how painful that can be.

Also, success will depend on up-to-the-second mapping accuracy: newer, obscure routes and roads under repair, flood, or damage will cause huge problems, so the drone-cars will have to stick to more well-traveled roads where the information is up to date. This limits personal choice as to destination and route.

And if it just keeps to more main roads, the driverless cars have little to offer over a bus or streetcar, except again by putting multiple units on the road, increasing traffic.

    I read an article that said auto industry insiders expect this technology will result in a 75% reduction in the number of cars being sold.

    They think that the self-driving cars will merge with technologies like Uber and people in the cities and suburban areas will simply give up purchasing a car and instead opt for some type of service where they “rent” one when they need it.

    Estragon | October 29, 2015 at 5:40 pm

    There are lots more issues. If this becomes universally available, the number of vehicles and demand will jump,
    ——————
    Not sure that will be true. I somehow think our “leaders” will come up with public transportation available upon a call, email or text. The autonomous vehicle could be sent to pick people up and deliver them to their destination.
    Rural areas with a small population might be left out but I’d guess at a certain population level our “leaders” would restrict access to personal vehicles.
    Face it: lots of people go to the same grocery stores and department stores, and go to work along the same roads at approximately the same times.
    An airport-style mini bus in quantities sufficient for the population could be implemented on the grounds of lessening traffic congestion and reducing pollution (not to mention tracking of the population). And what politician (or corporation) wouldn’t like to know every detail of every person in that town?

      Insufficiently Sensitive in reply to 4fun. | October 29, 2015 at 8:12 pm

      I somehow think our “leaders” will come up with public transportation available upon a call, email or text.

      That indicates a hope for, or tolerance of, an authoritarian government, if not a fascist one. It puts all too much faith in Great Leaders limiting citizen mobility through micromanagement.

      Rural areas with a small population might be left out but I’d guess at a certain population level our “leaders” would restrict access to personal vehicles.

      Do consult with the elected blue-state ‘leaders’ of Seattle, who are moving step by step in exactly that direction. Currently engaged in traffic ‘improvements’ that force more congestion on auto traffic, they’re driving in the direction of ‘getting people out of their cars’ – an explicit goal which our leftist political parties began chanting in the 1980s.

      Paul in reply to 4fun. | October 29, 2015 at 8:16 pm

      Why would such a system have to be provided by government? We already see numerous car rental services and things like Uber and Lyft evolving. Merge them with this AV technology and you’ve got a really, really nice transportation solution.

      This was one of the major reasons why I was vocally opposed to a “light rail” measure in Austin last year. Ludicrous for government to be spending billions on a train when in just a few years we’ll have fleets of self driving cars to take everyone where they need to go.

      And if you don’t want to use them, then buy your own. Or share one with your friends and family. No need for government involvement.

      In fact, government monopolies (can you say “Crony Capitalism”) like taxi companies and wasteful big government programs like buses and trains could be a thing of the past… it will be MUCH cheaper to simply subsidize ride costs for those truly in need of assistance.

    Barry in reply to Estragon. | October 29, 2015 at 8:11 pm

    “There are lots more issues. If this becomes universally available, the number of vehicles and demand will jump, creating more traffic and pollution – which invites more regulation, and we all know how painful that can be.”

    Actually, self-driving cars will reduce the number of vehicles. Imagine: you will not need to own a car, just a schedule. The car zips up, you get in, and off you go. When ready to return, you schedule another. Meanwhile the car is getting used to ferry others on a different schedule. Think taxis without drivers. Regulation? Yes, maybe more.

    “Also, success will depend on up-to-the-second mapping accuracy: newer, obscure routes and roads under repair, flood, or damage will cause huge problems, so the drone-cars will have to stick to more well-traveled roads where the information is up to date. This limits personal choice as to destination and route.”

    Real-time information on routing will be employed to take you around obstacles and traffic. Up-to-the-second mapping accuracy is certainly not something humans could maintain, but computer-driven systems can.
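
    A toy Python sketch of what “routing around obstacles” boils down to – a shortest-path search that simply skips any road segment a live feed has flagged as closed. The road graph, travel times, and closure feed here are all invented for illustration, not anyone’s real system:

        import heapq

        def shortest_route(graph, start, goal, closed):
            """Dijkstra over travel times, skipping segments flagged as closed."""
            dist = {start: 0.0}
            prev = {}
            pq = [(0.0, start)]
            while pq:
                d, node = heapq.heappop(pq)
                if node == goal:
                    break
                if d > dist.get(node, float("inf")):
                    continue  # stale queue entry
                for nbr, minutes in graph[node]:
                    if (node, nbr) in closed:  # live feed says: blocked
                        continue
                    nd = d + minutes
                    if nd < dist.get(nbr, float("inf")):
                        dist[nbr] = nd
                        prev[nbr] = node
                        heapq.heappush(pq, (nd, nbr))
            path, node = [], goal
            while node != start:  # walk the predecessor chain back to start
                path.append(node)
                node = prev[node]
            path.append(start)
            return list(reversed(path))

        # Travel times in minutes; a flooded main road forces the detour.
        roads = {
            "home":    [("main_st", 5), ("back_rd", 9)],
            "main_st": [("store", 4)],
            "back_rd": [("store", 7)],
            "store":   [],
        }
        print(shortest_route(roads, "home", "store", closed={("home", "main_st")}))
        # -> ['home', 'back_rd', 'store']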

    “And if it just keeps to more main roads, the driverless cars have little to offer over a bus or streetcar, except again by putting multiple units on the road, increasing traffic”

    Wrong completely.

A far simpler question seems to be the devil in the details here. If your highway has a speed limit of 60, and every other car is doing 70, will your automatic driving car do 60 or 70? How about when you come to one of those signs marking a crossroad at 50 or so, where nobody pays any attention to the sign and everybody just keeps going? Are you going to get rear-ended by a 70 mph car when the robocar pops on the brakes for apparently no reason?

As for the “Car detects multiple humanoids in the road” question: Any programmer worth a dime will simply have the car brake as hard as it can to reduce the end velocity of the impact. There *might* be real humans on the road, or there might not, but the car *does* have humans in it, and if your car runs into a bridge abutment because it saw a couple of deer or a blowing newspaper, bad things will happen to the car company. Swerving is *worse* in that the thing you are headed towards may swerve too, and normally right into you. (It’s deer season here. Every single day, I pass a busted vehicle next to a large red splotch. They’re like bumper-guided missiles.)
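
The physics behind “brake as hard as it can” fits in a few lines. A toy Python sketch (the 8 m/s² braking figure is an assumption, and nothing here is any manufacturer’s actual logic) shows why full braking is the robust default: every meter of braking distance strips speed off the eventual impact, even when the hit can’t be avoided.

    import math

    def impact_speed(speed_mps, distance_m, max_brake_mps2=8.0):
        """Speed remaining at the obstacle after full braking.

        Uses v^2 = v0^2 - 2*a*d; returns 0.0 if the car stops in time.
        """
        v_sq = speed_mps ** 2 - 2.0 * max_brake_mps2 * distance_m
        return math.sqrt(max(0.0, v_sq))

    # 50 km/h (~13.9 m/s) with 15 m of clear road: stops short of the obstacle
    print(impact_speed(13.9, 15.0))           # -> 0.0
    # 50 km/h with only 8 m: the hit still happens, but at ~8 m/s instead of ~14
    print(round(impact_speed(13.9, 8.0), 1))  # -> 8.1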

The standards for such programming and the answers to such questions will undoubtedly be set by government. And it will be a beautiful thing when the government directs the programming based on social justice concerns. If the car is driven by a white person and the crowd in the street is a bunch of Ferguson, Missouri social justice warriors burning and looting, then speed up and crash into the wall. If the car is driven by a black person and white pedestrians are crossing the street, head for the pedestrians.

As for speed limits: cars owned by whites will have to go a mandatory 10 mph under the speed limit. Cars driven by Asians will go 15 mph under the speed limit and then periodically switch into reverse for random periods of time. Cars driven by African Americans will go up to 10 mph over the speed limit, and Hispanics’ cars will go the speed limit.

Cars driven by republicans will mysteriously stop working at random days and times, but always on election days.

Yes the self driving car will usher in a utopia of social justice for all. Can’t wait !!!!

On a more serious note, I would never trust my life or the well-being of my family to any computer nerd programmer or the government, and especially not to a combination of computer nerd programmers and the government. Just imagine the Obamacare website teams of government and contractors making life and death decisions for you and your family.

    Why does everyone keep saying this is going to be a government thing? The entities making major strides here are private companies… yes there has been some DARPA funding behind some of the bleeding edge stuff, but so what? That’s true of many now-common technologies in our private lives.

    As far as entrusting your lives to nerd programmers, you do it all the time already. Airline auto-pilots, SCADA systems that run water systems and power grids, robotic surgery, and the list goes on.

      You aren’t thinking far enough into the future. Not government-invented, but government-regulated to the point that the government calls the tune on these ethical design questions.

        Yeah, we’re going to be faced with a lot of ethical questions related to technology in the future. There are plenty already today; the debate around vaccinations is one example. But I don’t think that should prevent us from moving the technology forward. We must continue to fight to shrink government and get it out of areas it doesn’t belong.

        There are ways to address these ethical challenges without government sticking its fat nose into everything. The more that information “is free,” the more the community can enforce things on its own. Government does a notoriously crappy job (of just about everything it tries to do). And as the “social justice” cancer further infects the government body, it will only get worse and worse (as you noted with your lawyerly attempt at humor above).

When I’m on my ranch there is no way I’m letting the computer drive. In fact I don’t want any computer at all in my ranch truck (no computerized fuel injection, etc).

But when I’m in the city, or driving to Houston or Dallas, I would very much like to punch in my destination and sit back and nap, or work, or read, or sip a G&T.

    Anchovy in reply to Paul. | October 29, 2015 at 8:30 pm

    Yeah… there are endless highways where I would love the ability to kick back and just relax. Kingman, AZ to Mojave, CA comes to mind. When the only exciting thing on the whole road is either Barstow or Kramer Junction, that defines a boring road.

MaggotAtBroadAndWall | October 29, 2015 at 6:51 pm

I remember a similar moral dilemma from an ethics class. The case of the runaway train.

The train is barreling down the tracks and can’t be stopped. The engineer radios ahead to you, the switchman. If you allow the train to continue down its current path, you know it will crash into five workers and kill them instantly. However, you can activate the switch and cause the train to go down a side track, in which case only one worker will be killed. What do you do? Most people say it is more ethical to pull the switch so the train will kill only the one person on the side track.

Then you play around with the scenario. What if the five workers are really convicted murderers on death row – who have been let out of the penitentiary on super strict supervision to work on the railroad tracks for the day? Do you still pull the switch to sacrifice one innocent life for the lives of five men already condemned to die?

Or what if the five workers are complete strangers to you, but you know the worker on the side track very well. Maybe he is your brother or father or someone you care about. Do you pull the switch and sacrifice the life of your loved one so the five strangers can live? Or do you let the train kill the five strangers so the person you care deeply about lives? If you pull the switch only one family – YOURS – experiences pain, grief and sorrow from the loss of a loved one. But if you do not pull the switch, then five families will suffer.

Suppose you are a surgeon with five patients who are certain to die in the next week unless they each get a different organ transplanted. There are no spare organs available to save their lives. So you know with absolute certainty they will all be dead within a week. In walks a homeless vagrant with no family ties. He is young and in good health. It’s a magical world, so you can quickly determine that his organs are perfect matches for your five patients about to die. Nobody will miss the vagrant if you choose to kill him and transplant his organs into the five people. They will go on to live happy, productive, loving lives. And nobody will ever know you killed the vagrant to harvest his organs, so you will not be exposed to legal liability. The only thing stopping you is your conscience. Is it morally right for you to sacrifice the life of one healthy vagrant man so five productive people with families facing certain death can live?

Technology isn’t moral, and the self-driving car will never truly make an ethical decision like those in the various “runaway train” scenarios. We’d be outsourcing the ethical and moral decisions to computer programmers. Have you seen some of those guys?

Not sure I want to live in that kind of world.

Henry Hawkins | October 29, 2015 at 7:12 pm

Would somebody tell Amy Miller she left the comments closed again on the Paul Ryan, new Speaker post?

Henry Hawkins | October 29, 2015 at 7:13 pm

RE: Self-driving cars, I don’t want my last thought to be, “well, my family will have a sweet lawsuit…”.

    Bingo, Henry. Who’s going to insure the robot, anyhow? I’ll bet that big Google will persuade big gov’t to cover it. Your family will get a nice letter from Obama.

    I’m an engineer who knows quite a bit about computers, algorithms and pattern recognition. What I’ve learned is that you’ll never get the thing to behave like a person, so you can’t integrate it with traffic. Maybe some slow mover that stays in the right lane on city streets could work, but even then it’s bound to hit someone at some point. Then we’ll hear that you have to break a few eggs to make an omelette.

    I’ve found that younger engineers are more likely to be thrilled with the idea. All I can see is risk and annoyance. Why should I have to retrain myself after spending 20 years learning how to drive in Boston? And like you, I want to know who’s going to cover it.

    Next they’ll propose replacing air-traffic controllers and pilots. That would be the end of air travel. Maybe they’ll replace my wife … please! (Apologies to Henny).

      Paul in reply to JerryB. | October 29, 2015 at 8:24 pm

      They’ve already logged millions of miles in real-world traffic. They’ve been doing it for a few years in California and Las Vegas. They recently started running them around Austin. No injuries yet.

        Henry Hawkins in reply to Paul. | October 29, 2015 at 9:52 pm

        “In other words, self-driving cars were five times as likely to crash as conventional ones, and their passengers were four times as likely to get injured (with 3.29 injuries per million miles, compared to only .77 injuries in regular cars). Self-driving cars were also rear-ended 50% more often than traditional vehicles.”

        http://fortune.com/2015/10/29/self-driving-cars-crash/

          Interesting article, but fender-benders all. Have you seen one of these self driving cars yet? When you get a chance, take a look at everybody around them… everybody is driving by staring, trying to take cellphone photos and generally rubber-necking like nobody’s business. No wonder they keep getting run into.

          Oh, Henry. Don’t bother with facts. It’s the goal that’s important, not property damage or injuries.

Conservative0317 | October 29, 2015 at 7:16 pm

Pretty much everything else has been covered in the comments, except: Have they programmed for the Kobayashi Maru?

Q. What is the most useless thing you can think of?
A. A self driving Ferrari.

Insufficiently Sensitive | October 29, 2015 at 8:03 pm

You’re in a self-driving car and, after turning a corner, find that you are on course for an unavoidable collision with a group of 10 people in the road with walls on either side.

This situation means that your driver, your ‘algorithm’, took you into a blind curve going too fast. Fire that programmer – tight curves without further visibility mandate slowing before turning.
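
That mandate is easy to state precisely: never enter a curve faster than you can stop within the road you can actually see. A minimal sketch in Python, assuming an 8 m/s² maximum braking figure (an assumption for illustration, not any vendor’s spec):

    def max_safe_speed(sight_distance_m, max_brake_mps2=8.0):
        """Highest entry speed that still allows a full stop within sight distance.

        From v^2 = 2*a*d: above sqrt(2*a*d), a hidden obstacle cannot be
        avoided no matter what the planner decides afterward.
        """
        return (2.0 * max_brake_mps2 * sight_distance_m) ** 0.5

    for d in (20.0, 10.0):  # meters of visible road ahead
        v = max_safe_speed(d)
        print(f"{d:4.0f} m visible -> {v:4.1f} m/s ({v * 3.6:4.1f} km/h)")
    # 20 m caps the car near 17.9 m/s (~64 km/h); 10 m near 12.6 m/s (~45 km/h)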

I’m more concerned with such things as: system crashes (which gives real meaning to the term Blue Screen of Death), malware, and someone actually hijacking the car.

    Paul in reply to genes. | October 29, 2015 at 8:26 pm

    Yeah, I wouldn’t recommend Windows OS for your car.

    The hacking concern is real, but not limited to these AV systems. There was a great article in Wired a few months ago about some guys who hacked a new Jeep Grand Cherokee… they could do dangerous stuff like jam on the brakes or floor the accelerator, if I remember correctly. Cars are already highly computerized… they need to be hardened with or without self-driving functionality.

“You’re in a self-driving car and, after turning a corner, find that you are on course for an unavoidable collision with a group of 10 people in the road with walls on either side.”

LOL, how many drivers do you think will make the correct choice here? Most would lock up the brakes and skid into those people. Very few drivers are trained to respond to situations like this.

Self driving cars will reduce the number of accidents and highway deaths.

Trucks and taxis will be driverless in 10 years or less. They will operate 24/7/365 with stops only for programmed maintenance.

    Paul in reply to Barry. | October 29, 2015 at 8:28 pm

    Yep, the big rigs will probably go first. Kinda funny to think that OTR drivers will necessarily become computer jocks.

Henry Hawkins | October 29, 2015 at 9:19 pm

Hackers might have a field day with computer driven systems. I don’t pretend to know the engineering (yo, JerryB?), but it seems if a thing can be hacked it will be hacked.

—————————–

Here’s a study for consideration. It’s by U of M’s Transportation Research Institute, which reportedly has a good rep, situated as it is next to the auto-historic ‘Motor City’:

“For every million miles driven, self-driving vehicles had 9.1 crashes, compared to just 1.9 for regular vehicles, according to a report released Thursday by the University of Michigan’s Transportation Research Institute. In other words, self-driving cars were five times as likely to crash as conventional ones, and their passengers were four times as likely to get injured (with 3.29 injuries per million miles, compared to only .77 injuries in regular cars). Self-driving cars were also rear-ended 50% more often than traditional vehicles.”

http://fortune.com/2015/10/29/self-driving-cars-crash/
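
For what it’s worth, the multiples in that excerpt check out against the raw rates. A quick sanity check in Python, using only the numbers quoted above:

    # Rates quoted above, per million miles driven (UMTRI via Fortune)
    av_crashes, regular_crashes = 9.1, 1.9
    av_injuries, regular_injuries = 3.29, 0.77

    print(f"crash ratio:  {av_crashes / regular_crashes:.1f}x")    # 4.8x, i.e. "five times"
    print(f"injury ratio: {av_injuries / regular_injuries:.1f}x")  # 4.3x, i.e. "four times"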

    Henry Hawkins in reply to Henry Hawkins. | October 29, 2015 at 9:39 pm

    Henry, will these new systems be as good and as accurate as GPS devices?

    We’ll never land a man on the moon.

    You’re looking at the initial testing. The systems will become better and more robust over time. We’re just looking at the tip of the iceberg now.

    It’s inevitable.

    from the link:

    “The authors agreed that self-driving cars were not to blame in any of their crashes, and also found that injuries in those incidents tended to be less serious than in regular car accidents. Finally, self-driving cars actually have a lower fatality rate, with zero deaths resulting from their crashes.”

      genes in reply to Barry. | October 30, 2015 at 7:50 am

      From what I’ve read, the Google cars were being kept to roads at 35 mph and less. That would result in less severe injuries.

It’s not the cars that disturb me, but how the government will inevitably find ways to exploit them and abuse their new-found powers. Cars that can be hacked or otherwise compromised so that they become useless, or that surreptitiously report your location to the local police? This is probably more dangerous than firearms that rely on electronics (like ID systems) to work, which could possibly be rendered inert by remote control. The inability of citizens to directly control their own personal conveyance is a serious threat to liberty. The government won’t need to see your papers if you must rely on transportation systems (both “personal” like autonomous cars and public transportation) that are under government control or that can be crippled or disabled by government action.

On a less ethical note, but equally interesting as a logic problem… I read a post a while back about someone who watched one of the Google cars stop in the middle of traffic. It seems the car noticed that a cyclist on the sidewalk was standing on his pedals but not moving, or only moving at a very slow pace – standing on them while waiting at the crosswalk – and the car did not know how to recognize that the cyclist was not actually going to run out in front of it.

In any case, I agree that correct programming would have slowed the car before going around any blind turn. The answer to the moral question would be whatever current liability and criminal law require. Which choice would a reasonable man be required to make: aim for the pedestrians or the wall?

We’ll get the functional, practical assessment of SD vehicles when we see how insurance companies set premiums. Their actuaries are brutally accurate.

I believe self driving vehicles for the masses is a liberal wet dream.

Suicide bombers might be interested in them though.

Imagine a self-driving vehicle hooked up to remote control, like a big truck used to run people down and slaughter them, with no driver behind the wheel for the police to bring down to halt the carnage.

If you think R/C drones are bad… imagine a remote 18-wheeler.
If it can self-drive, it can be controlled remotely.

buckeyeminuteman | October 30, 2015 at 2:02 pm

It’s best to give all robots a 6-foot power cord. That way you have a better chance of running away when they inevitably turn on you.