MIT game lets you decide which humans survive in self-driving car scenarios

These are the hard choices our robotic cars will soon be forced to make.

As self-driving cars slowly make their way onto our roads, how will developers help these autonomous vehicles make difficult decisions during accidents? A new MIT project illustrates just how difficult this will be by mixing gaming with deep moral questions.

The Moral Machine presents you with a series of traffic scenarios in which a self-driving car must make a choice between two perilous options.

Should you avoid hitting a group of five jaywalking pedestrians by hitting a concrete divider that will kill two of your passengers? If there's no other choice, do you drive into a group of young pedestrians, or elderly pedestrians? Do you swerve to avoid a group of cute cats and dogs, or hit a doctor, a man and an executive? With only two choices, do you hit a large group of homeless people obeying traffic laws or a small child jaywalking against the traffic light?

These are the kinds of split-second moral choices self-driving cars will inevitably be forced to make, and MIT's experimental site reveals just how dark things could get. The game includes everything from pregnant women to small children, and at the end you get to see how your choices stack up against those of other players.

So far, most of the judgments from users lean toward saving more female than male lives, saving younger people before the elderly, and saving humans over pets. However, when it comes to the question of protecting passengers versus pedestrians, players were split roughly 50/50.


You can also design your own scenarios (see cats vs. dogs).

According to MIT, the experiment was designed to provide "a platform for 1) building a crowd-sourced picture of human opinion on how machines should make decisions when faced with moral dilemmas, and 2) crowd-sourcing assembly and discussion of potential scenarios of moral consequence."

MIT doesn't specifically call its project a "game," but when you finish a full set of scenario questions, the site asks if you'd like to "play again."

Currently, these choices are just a thought exercise, but if people like Lyft co-founder John Zimmer are right, companies rolling out autonomous vehicles will have to grapple with such questions in just a few years.

In the meantime, we can all use the site to consider the implications of tasking machines with making life-and-death decisions for passengers and pedestrians alike.

Adario Strange
