Self-Driving Cars
Imagine a wreck on the interstate. A big hauling truck has a number of boxes strapped to the back, and they suddenly start to fall toward a self-driving car. Now you are a programmer tasked with evaluating three options in this wreck: veer right into a moving motorcyclist, veer left into an SUV, or stay in the middle lane and let the boxes fall onto you, with the chance of a casualty occurring. Today in class, this is exactly what we discussed.
If I were the programmer, I would have the car "choose" to go left into the SUV, as it seemed the most ethical option with the fewest casualties. Overall, I think approaching this thought experiment is difficult. Hearing everyone's opinions today was very enlightening; one person brought up liability, while another considered whether the surrounding cars were self-driving or not. I could share many more of the creative ways this experiment was approached, but instead I will share my thoughts on the social and policy-related issues that could arise with these self-driving vehicles.
Considering the social aspect of things, most people would expect a consensus to form, with the majority agreeing on what they perceive as the "normal" behavior. I know my response was not like everyone else's: some suggested merging or going left, while in my case I would just stop and take my chances with the boxes. As someone who has had experience with wrecks, I can tell you firsthand that it comes down solely to instinct. While, as the programmer in the thought experiment, I chose to go left on ethical grounds, I know for a fact that I would have chosen differently in real life. In a wreck I panic, so in the spur of the moment I would just stop and hope for the best, as that's what I did in my former wreck. Everyone's answer is different!

For the policy-related issues, I have come to the idea that the driver of the self-driving car simply accepts the terms they put themselves in for this situation. They knew the risks of getting in a wreck. As a "solution," I think there should be a waiver or contract that holds the driver, and any passengers who chose to ride in the vehicle, liable for their actions.
Good thoughts, Lilah! I wonder, do you think that because you would likely react in a "dumb" car by stopping and hitting the boxes, is that how the self-driving cars should be programmed? Swerving into the SUV seems like a bad idea, since it punishes someone for making a prudent choice. But it does maximize safety. What do you think the programmer should do?