If you haven’t heard of the trolley problem, it goes something like this. A trolley is speeding down the tracks toward a group of five people who are tied up and unable to get out of the way. If the trolley hits them, it is certain death. Luckily, you are standing right next to a lever. If you pull it, the trolley will be forced onto a side track, avoiding the people ahead but heading directly toward another person who is also tied up and unable to move. Which is the more ethical choice: do nothing and let the five people die, or intervene and pull the lever, sacrificing the person on the side track? Most people say they would pull the lever, but what if the story is slightly changed? Imagine the trolley is barely visible in the distance but heading directly toward the same five people.
This time, instead of a lever, you notice a large man standing dangerously close to the edge of the track, about half a mile from the group. A thought pops into your head: you could push him onto the tracks, killing him but forcing the trolley to stop before it reaches the group. Is saving the five people still the most ethical choice? It’s not hard to see how situations like these might apply to the road. The thing is, when you’re driving a car, the reactions you make in a stressful situation are just that: reactions. However, as driverless cars become more and more of a reality, there are questions that need to be asked. Responding to a potential crash by swerving into an intersection is no longer a reaction but a pre-programmed decision.
That decision can carry a lot of weight. In some instances it is the deciding factor in who lives and who dies. Let’s say you go to the dealership to purchase a new car. Would you prefer the model that was programmed to save as many lives as possible? Or would you rather have a car that protects you at all costs? The car designed to minimize casualties is morally the better choice, but that’s not really how the world works. For the most part, humans act in their own self-interest, meaning most people would rather have the car that prioritizes the passengers’ safety above all. That is not to say people are inherently selfish, because I don’t think that is the case. Wanting to protect your own well-being is a natural instinct that everyone possesses.
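To make that trade-off concrete, here is a minimal sketch in Python, assuming a toy cost model: a hypothetical choose_maneuver function that weighs expected passenger harm against expected bystander harm. The maneuvers, the risk numbers, and the passenger_weight parameter are all illustrative assumptions, not how any real manufacturer actually programs its vehicles.

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    """A candidate action with rough, made-up harm estimates."""
    name: str
    passenger_risk: float  # expected harm to people inside the car (illustrative)
    bystander_risk: float  # expected harm to people outside the car (illustrative)

def choose_maneuver(options, passenger_weight=1.0):
    """Pick the maneuver with the lowest weighted expected harm.

    passenger_weight = 1.0  -> "minimize total casualties" policy
    passenger_weight > 1.0  -> "protect the passengers first" policy
    """
    def cost(m):
        return passenger_weight * m.passenger_risk + m.bystander_risk
    return min(options, key=cost)

options = [
    Maneuver("brake hard, stay in lane", passenger_risk=0.3, bystander_risk=0.0),
    Maneuver("swerve into intersection", passenger_risk=0.05, bystander_risk=0.4),
]

print(choose_maneuver(options, passenger_weight=1.0).name)   # "brake hard, stay in lane"
print(choose_maneuver(options, passenger_weight=10.0).name)  # "swerve into intersection"
```

With equal weighting the car brakes and stays in its lane, because that produces the fewest total expected casualties; with a heavy passenger weighting it swerves, shifting the risk onto the bystander. The ethical question is simply which weight gets written into the code before anyone is ever in danger.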
However, with a topic like this it’s very easy to get caught up in all the little hypotheticals without stepping back and looking at the bigger picture. When you ask people what they think the most dangerous activities are, driving isn’t usually mentioned. Instead, it is common to hear things like air travel (even though there were zero passenger-jet-related deaths in 2017, according to the Aviation Safety Network). This is strange considering driving kills 3,287 people every day and over 1.3 million a year, making it the leading cause of death for Americans ages 1 to 34. Not only that, it is also the leading cause of long-term disability. The good news is that self-driving cars are the answer.
As the technology becomes more advanced and widely adopted around the world, millions of lives will be saved. The elimination of human error is probably the single biggest factor in why these cars will be so safe. According to Stanford Law School, approximately 90% of all crashes are caused by human error. Driverless cars will bring an end to most of these accidents. They will also come equipped with greatly improved braking, enabling them to avoid collisions much more effectively than any human could. Of course, this is where those pre-programmed decisions come into play. However, the chance of a car deliberately killing one person to save another is incredibly small.
The reason is that, unlike human drivers, driverless cars will be able to get out of a situation before it escalates to the point where the car is forced to make a morally difficult decision. That’s why it’s important to realize that the ethical problems I talked about before aren’t really that big of a concern. It’s not hard to construct scenarios that make self-driving cars look like a complex moral dilemma (as shown by a recent surge in articles portraying driverless cars in a very negative light), when in reality that couldn’t be further from the truth. There are reasons fully autonomous cars haven’t hit the market yet, but I think it’s imperative to embrace this technology, because every day we don’t, thousands of people die from accidents that are easily preventable.