I came across an interesting article today that explores an intriguing ethical dilemma, one related to a subject I'm very big on: cars. There is a thought experiment that increasingly demands an answer, and it basically asks: if you (the sole occupant) are riding in an autonomous car and are suddenly about to hit a group of people, should the car's programming save you by hitting the group, or should it swerve to avoid them, causing catastrophic damage to your car and killing you in the process?
As autonomous cars grow more capable and more common, this is a serious question that automakers and the people who use self-driving cars need to consider. There are two ways to look at the dilemma: the logical solution is to let your car sacrifice you in order to save the greater number of people, while the emotional solution is to save yourself at the expense of the others. People rarely face this sort of decision in real life, and when they do, many end up sacrificing themselves. But what if your car has to make the decision for you? Is it more ethical for automakers to program the logical action, or to protect their customer (you) at the expense of others who happened to be in the wrong place at the wrong time?
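To make the contrast concrete, here is a toy sketch of the two policies as code. Everything in it (the function names, the idea of scoring each maneuver by expected casualties) is hypothetical and invented purely for illustration; it is not how any real self-driving system is built.

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    """A hypothetical emergency option the car could take."""
    name: str
    pedestrians_harmed: int  # expected pedestrian casualties
    occupant_harmed: int     # expected occupant casualties

def logical_policy(options):
    """'Logical' (utilitarian) choice: minimize total expected casualties."""
    return min(options, key=lambda m: m.pedestrians_harmed + m.occupant_harmed)

def emotional_policy(options):
    """'Emotional' (self-preservation) choice: minimize harm to the occupant,
    breaking ties by harm to pedestrians."""
    return min(options, key=lambda m: (m.occupant_harmed, m.pedestrians_harmed))

# The scenario from the article: plow ahead or swerve into a barrier.
options = [
    Maneuver("continue straight", pedestrians_harmed=5, occupant_harmed=0),
    Maneuver("swerve into barrier", pedestrians_harmed=0, occupant_harmed=1),
]

print(logical_policy(options).name)    # -> swerve into barrier
print(emotional_policy(options).name)  # -> continue straight
```

The whole debate, really, is about which of these two functions (if either) automakers should be allowed, or required, to ship.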
This would definitely be an interesting topic for discussion. The link to the article is here:
http://www.sciencedaily.com/releases/2015/06/150615124719.htm