Author Topic: Moral Machine  (Read 1834 times)

micah

  • A real person, on the Internet.
  • Ass Wipe
  • Posts: 6915
  • Karma: +58/-55
  • Truth cannot contradict truth.
    • micahj.com
Moral Machine
« on: October 18, 2016, 10:47:12 AM »
With the advent of self-driving cars and other AI, who decides these morally impossible decisions?  Should the car's primary objective be to protect its passengers, or those around it?

Here's a fun test you can try yourself:  http://moralmachine.mit.edu/
"I possess a device, in my pocket, that is capable of accessing the entirety of information known to man.  I use it to look at pictures of cats and get in arguments with strangers."

ober

  • Ashton Shagger
  • Ass Wipe
  • Posts: 14310
  • Karma: +73/-790
  • mini-ober is taking over
    • Windy Hill Web Solutions
Re: Moral Machine
« Reply #1 on: October 18, 2016, 11:13:10 AM »
I did that a week ago.  I don't recall the exact results, but I think I favored younger people over older ones, and I favored killing fewer people in most cases.  Oh, and I apparently don't give a fuck about killing pets.

charlie

  • Jackass In Charge
  • Posts: 7903
  • Karma: +84/-53
Re: Moral Machine
« Reply #2 on: October 18, 2016, 12:38:30 PM »
Some of these aren't making sense... You have brake failure. Do you ram into a barrier and kill your occupants, or swerve and kill a dog? Umm... that dog ain't gonna stop the car. If you swerve and kill it, you're going to keep going and kill more things.

Also, that dog was "flouting the law by crossing on a red signal". ???

Ok... my results:

http://moralmachine.mit.edu/results/754707444
« Last Edit: October 18, 2016, 12:52:27 PM by charlie »