KDnuggets Data Mining, Analytics, Big Data, and Data Science


Poll: Current
Imagine a self-driving car with one passenger, driving fast, a mountain on one side, cliff on the other side. Suddenly there are 5 pedestrians in front of the car. Should the self-driving car: [1358 votes total]

Drive straight (probably killing most of 5, but saving the passenger) (202)
Swerve, and go off the cliff (saving 5, but killing the passenger) (230)
Don't know (247)

Would you ride in a self-driving car that is programmed to kill its passenger in some cases?
Yes (160)
No, I will not ride in such a car (420)
Not sure (99)

Total Comments 13 | Start A New Comment
Post Info Comment
Posted By: Gregory Piatetsky, Editor

Posted On: December 28th
Views: 480
What about the car itself?

At some point the car may become intelligent enough that it will want to avoid its own damage or death.

Imagine a different experiment (set it up as a thought exercise) where the choice facing an unmanned car (no passengers) is between a small injury to one pedestrian (e.g., a broken toe) and the total destruction of the car. Asimov wrote about the Three Laws of Robotics, but the boundary cases are very tricky.

Posted By: Chipmonkey

Posted On: December 28th
Views: 484
Death Metrics

I wonder if there isn't a better metric that the car should use, such as lowering the risk of death for the most endangered person in the scenario. Being hit with a car probably isn't 100% fatal, but flying off a cliff is probably worse.

So minimize(max(ProbabilityOfDeath)) is an option. Or minimize(mean(ProbabilityOfDeath))... Then you run the risk of people lobbying for minimize(AgeWeightedProbabilityOfDeath) if you want to save younger people first, or some such madness (processing power will eventually reach the point where these estimates are pretty good).
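As a toy illustration of the two metrics above, here is a minimal sketch in Python. The action names and probability numbers are entirely hypothetical, invented for this example; real risk estimates would come from the car's perception and prediction systems.

```python
# Hypothetical sketch: each candidate action maps to the estimated
# probability of death for each person affected (5 pedestrians + passenger).
# All numbers below are illustrative assumptions, not real data.

def minimax_risk(actions):
    """Pick the action minimizing the worst individual death risk."""
    return min(actions, key=lambda a: max(actions[a]))

def mean_risk(actions):
    """Pick the action minimizing the average death risk."""
    return min(actions, key=lambda a: sum(actions[a]) / len(actions[a]))

actions = {
    # 5 pedestrians each at high risk, passenger unharmed
    "drive_straight": [0.4, 0.4, 0.4, 0.4, 0.4, 0.0],
    # pedestrians unharmed, passenger almost certainly dies
    "swerve_off_cliff": [0.0, 0.0, 0.0, 0.0, 0.0, 0.95],
    # hard braking spreads a smaller risk across everyone
    "brake_hard": [0.2, 0.2, 0.2, 0.1, 0.1, 0.05],
}

print(minimax_risk(actions))  # -> brake_hard (worst case 0.2 vs 0.4 and 0.95)
print(mean_risk(actions))     # -> brake_hard (mean ~0.14 vs ~0.33 and ~0.16)
```

Note that the two criteria can disagree: remove the braking option and minimax prefers driving straight (worst case 0.4 beats 0.95), which is exactly the kind of choice the poll asks about.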

Also, if the car were absolutely certain that someone would die, I'd think a specific default behavior, like stopping as quickly as possible, rather than "deciding" which person to kill, may be more culturally acceptable. The action may BE decisive, and it may even lead to worse accidents than an alternative (imagine, say, hitting a pedestrian vs. stopping on train tracks), but it's definitive without having to make choices based on ethics (which some people will see as a negative).

Posted By: Dale

Posted On: December 28th
Views: 484
I don't understand the results

Maybe I'm just slow this morning - or old this year, but I don't understand the results that show percentages that add up to 50% for each of the two questions. The instructions ask for one answer to each question, not one answer overall, so why do the percentages add to 50% for each question? I'm confused - can someone straighten me out?

Editor: yes, the numbers add to 200% because I am using this simple one-question survey tool to ask 2 questions. Will correct the percentages when published.

Posted By: Gregory Piatetsky, Editor

Posted On: December 27th
Views: 592
Self-driving car is not a jet engine

Your analogy with a jet engine is wrong, since a jet engine does not make autonomous decisions involving ethics. It is like comparing human decision-making with decision-making by a leg. They are not in the same category.

This survey is obviously honest, since we are not paid by anyone to conduct it :), but it does show biases in human thinking.



Copyright © 2016 KDnuggets.