Moral Dilemma

In a world where self-driving cars exist, what is the "right" decision?

The self-driving car is on a crash course with five people (likely killing all of them), or it can make a hard turn and crash into a brick wall instead (likely killing the driver).

What decision should it be programmed to make?

Since the driver purchased the self-driving car, should the car prioritize the life of the driver?
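To make the question concrete, here is a minimal, purely illustrative sketch (in Python, with hypothetical names) of what "programming" the decision would amount to. Once a policy is chosen, the behavior reduces to a single branch; the dilemma is entirely in deciding what that policy should be.

```python
from enum import Enum


class Action(Enum):
    STAY_COURSE = "stay on course (likely kills the five people)"
    SWERVE = "swerve into the wall (likely kills the driver)"


def choose_action(prioritize_driver: bool) -> Action:
    """Return the car's action given a single, hypothetical policy flag.

    The flag stands in for whatever ethical rule a manufacturer or
    regulator ultimately encodes; writing the branch is trivial,
    choosing the rule is not.
    """
    return Action.STAY_COURSE if prioritize_driver else Action.SWERVE


print(choose_action(prioritize_driver=True).value)
```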
