Autonomous Cars Moral Dilemma

Today, advanced technology is all around us, and it will only become more prevalent in the future. One of the most talked-about advancements is the autonomous car, a vehicle that drives itself without a human at the wheel. These cars are programmed to perceive their surroundings and navigate using lidar, GPS, radar, and other sensors and control systems. Their touted advantages include cleaner energy (since most would be electric), reduced costs, and improved safety through fewer accidents.

However, a more philosophical question has arisen out of this: whether your driverless car, in a life-or-death situation, should protect you if it means killing a pedestrian, or protect the pedestrian by crashing and potentially killing you. This creates uncertainty about how these cars should be programmed, and even about whether people will buy them in the future.

In an article for The Guardian, science editor Ian Sample describes a revealing survey on which car people would rather ride in: the one that saves you, or the one that sacrifices you to save someone else. Sample states, “In one survey, 76% of people agreed that a driverless car should sacrifice its passenger rather than plough into and kill 10 pedestrians. They agreed, too, that it was moral for AVs to be programmed in this way: it minimised deaths the cars caused. And the view held even when people were asked to imagine themselves or a family member travelling in the car.” Morally this makes sense: you want to save as many people as you can. But later in the article, when the same respondents were asked whether they would buy a car programmed to sacrifice its own driver if it came to that decision, most said they would not; they would rather not have that car at all. (A toy sketch of what such a “minimise deaths” rule could look like appears at the end of this post.)

Fully autonomous cars are not on the road yet, but the future holds many opportunities for new technology to make a difference. Because the world is seeing this kind of technology for the first time, it will be interesting to see what laws and regulations autonomous cars must have programmed into them. Soon we’ll be finding out.
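To make the “minimise deaths” rule from the survey concrete, here is a minimal toy sketch in Python of what a purely utilitarian decision rule could look like. To be clear, the `Outcome` class, the maneuver names, and the fatality estimates below are hypothetical illustrations invented for this post; no real autonomous vehicle is confirmed to compute anything like this.

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    # Hypothetical summary of one possible maneuver and its estimated
    # consequences; invented for illustration, not drawn from any real AV.
    maneuver: str
    passenger_deaths: int   # estimated fatalities inside the car
    pedestrian_deaths: int  # estimated fatalities outside the car

def choose_maneuver(outcomes):
    """Pick the maneuver with the fewest total estimated deaths,
    counting passengers and pedestrians equally, which is the
    utilitarian policy 76% of survey respondents endorsed."""
    return min(outcomes, key=lambda o: o.passenger_deaths + o.pedestrian_deaths)

# The survey's scenario: swerving kills the one passenger,
# while staying on course kills ten pedestrians.
options = [
    Outcome("swerve into barrier", passenger_deaths=1, pedestrian_deaths=0),
    Outcome("stay on course", passenger_deaths=0, pedestrian_deaths=10),
]
print(choose_maneuver(options).maneuver)  # -> swerve into barrier
```

Notice that the entire dilemma lives in that one `key` function: weighing a passenger death and a pedestrian death equally is exactly the policy most respondents endorsed in principle but, per the survey, would not buy for themselves.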


One thought on “Autonomous Cars Moral Dilemma”

  1. “Should your driverless car, in a life-or-death situation, protect you if it means killing a pedestrian, or protect the pedestrian by crashing and potentially killing you?”
    I find it very interesting that we treat this question as normal, since we should never have to consider a question like this when buying a vehicle. To me, the answer is quite clear: autonomous cars should not be on the market until they are completely perfected. Unfortunately, because of human error, perfection in this field is almost impossible to attain.
    The fact that we have to consider who should survive in a life-or-death situation is really quite concerning. The more we rush this product forward, the more accidents will occur. Recently, a driver was killed in an accident involving a Tesla in self-driving mode. Given the accidents that continue to happen with autonomous cars, we should rethink whether they are a product that is truly worth it. I read an article in USA Today that went into detail on this same topic: who should decide who gets killed in a life-or-death situation. I feel as though this generation gets so caught up in the future and in technological advances that we forget to take a step back and re-evaluate these products.

    https://www.usatoday.com/story/money/cars/2017/11/23/self-driving-cars-programmed-decide-who-dies-crash/891493001/
