You’re driving down a busy city street when a child chases a ball into the road directly in front of you. You have a split second to choose: hit the child in the street, or veer onto a sidewalk teeming with people?
Autonomous vehicles will face choices like these, raising questions about how they’ll decide whom to hit and whom to protect: car passengers or pedestrians? Think you know which way they’ll go? Try MIT’s Moral Machine, billed as “a platform for gathering a human perspective on moral decisions made by machine intelligence, such as self-driving cars.” This time out, it’s just a game…or is it?
Try the Moral Machine here.