This is basically the same logical and ethical problem that developers of self-driving cars are facing (it's part of why no car is fully autonomous yet).
Imagine the car is forced into an unavoidable crash where its only options are to hit the soft spot (in this case people, with a high chance of killing them) or the hard spot (with a low chance of survival for the passengers).
Should the car sacrifice its passengers to save others, or should it prioritize the passengers' lives?
But whenever deciding the fate of others, the rule the person in charge must abide by is "few is sentimental, many is statistical",
so most would have to default to the latter: treat the outcome statistically and choose whichever option costs the fewest lives overall.
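To make that "statistical" default concrete, here is a minimal sketch of what such a rule might look like if you reduced it to a pure expected-casualty comparison. The numbers, probabilities, and function names are all illustrative assumptions for this hypothetical scenario, not anything an actual vehicle system does.

```python
# Hypothetical sketch of the "statistical" decision rule described above.
# All counts and probabilities are made-up assumptions, not real data.

def expected_fatalities(people_at_risk: int, fatality_probability: float) -> float:
    """Expected number of deaths for one crash option."""
    return people_at_risk * fatality_probability

# Option A: hit the "soft spot" (pedestrians, high chance of death).
soft_spot = expected_fatalities(people_at_risk=3, fatality_probability=0.9)

# Option B: hit the "hard spot" (passengers, low chance of survival).
hard_spot = expected_fatalities(people_at_risk=2, fatality_probability=0.7)

# The purely statistical rule picks whichever option costs fewer expected lives.
choice = "hard spot" if hard_spot < soft_spot else "soft spot"
print(f"Statistical choice: {choice} ({min(soft_spot, hard_spot):.1f} expected deaths)")
```

Of course, the whole ethical problem is that real crashes don't come with clean probabilities like these, which is exactly why "few is sentimental, many is statistical" is easier to state than to program.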