Tesla Self-Driving Car Crashes and Kills Driver

A self-driving car created by Tesla crashed in California, killing its driver. Tesla said the car was operating on Autopilot. Tesla is a company that focuses on cars, especially electric cars, and in recent years it has been developing self-driving technology, as several other companies have. The accident took place on Highway 101 in Mountain View. According to Tesla, the driver was an Apple software engineer, Wei Huang. One thing that was noted was that the driver “had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision”. I was never really a big fan of self-driving cars because I knew that there could be several mistakes within the technology. I truly believe that humans should just continue to operate their own vehicles. In addition, “The driver had about five seconds and 150 meters of unobstructed view of the concrete divider … but the vehicle logs show that no action was taken.” This makes it seem like the driver was entirely at fault.

According to Tesla, the concrete highway divider had previously been damaged, which increased the severity of the impact on the car. The way I interpret this is that Tesla is blaming everything else for the accident.

After reading many stories of self-driving cars malfunctioning, I find it scary to operate or even ride in one. I personally will never use one.

https://www.theguardian.com/technology/2018/mar/31/tesla-car-crash-autopilot-mountain-view

Ramar Parham

5 thoughts on “Tesla Self-Driving Car Crashes and Kills Driver”

  1. I completely agree with you that cars shouldn’t be given the power to control our lives. Even though the concept of autopilot or self-driving cars might be cool, they have many disadvantages. They take away jobs, which creates job insecurity for a lot of people, and anyone could hack into your car and control it, which raises security issues. How in the world can you program a self-driving car to follow the rules of the road when those rules vary from state to state?

    That question is beyond my pay grade. But I do know that roads are governed by the states and technology is regulated by the federal government. Expect to see some conflicts of interest.

    Most regulators lack the expertise they need to oversee this technology effectively. Due to this problem, Google might get to write their own rule-book. Should they have so much influence?

    Overall, I feel the potential safety benefits of self-driving cars outweigh the potential risks. Time will tell whether my perception is correct.

    http://www.globalautotransportation.com/main-disadvantages-self-driving-cars/

  2. Self-driving cars have developed into something much bigger in the past couple of years. With companies like Google and Tesla, we have seen how this invention is being incorporated into the daily life of societies and communities. However, not everyone likes the idea of these new technologies because of accidents like the one mentioned in the post above. Tesla is not only facing issues regarding self-driving cars; in this same accident, the safety of electric vehicles was also questioned.
    After this self-driving car created by Tesla crashed, it caught fire. Because of the fire, the highway had to be closed down for five hours, and firefighters had to go in with special suits because of its intensity. Experts argue that the severity of the fire was related to the fact that the car was an electric vehicle, and they wonder whether its batteries had anything to do with the intensity. This issue sheds light on the debate over whether electric vehicles are safer than gasoline-powered vehicles.
    This issue puts Tesla in a very difficult position, since not only is the safety of self-driving cars being questioned, but so is the core of their company, which is electric vehicles.
    I personally believe that the fire was caused primarily by the crash, which is something that can be avoided. When it comes to fire, both types of cars can end up in the same position, and if electric vehicle fires are harder to put out, then in a future where these cars are more common, more firefighters will know how to deal with them effectively. I personally don’t think this issue has to be a big concern for Tesla, since it shouldn’t discourage consumers from buying EVs, which are more beneficial to society and the world.

    Sources: https://www.livescience.com/62179-tesla-fire-cleanup-danger.html

  3. I agree with you on this issue. Although self-driving cars can be useful in many ways, I feel like the risks outweigh the benefits. According to a blog published on Harvard University’s Graduate School of Arts and Sciences’ website, a fully autonomous self-driving car needs to have the capacity to do three things that human drivers do on their own: perceive, think, and act. Although the companies making these cars claim that they are able to do all of these things, I am still skeptical about the concept of artificial intelligence being able to operate on the same level as the human mind.

    According to the same article, however, self-driving cars will likely reduce accidents on the road because they are always operating at their maximum ability, unlike distracted or drunk drivers. Still, software engineers do not fully trust the cars to assess every possible situation and obstacle successfully. There have been many setbacks in testing these vehicles, and it is clear that a lot of research still needs to be done. It’s going to be a while before people are comfortable with these vehicles being on the road.

    Source: http://sitn.hms.harvard.edu/flash/2017/self-driving-cars-technology-risks-possibilities/

  4. I certainly agree with this post and believe that self-driving cars are creating a lot of problems. It is tough to tell, however, what the best option is. On one hand, they make cars aware of their surroundings to deliver a safer drive, minimize accidents, and limit the number of drunk drivers on the road. On the other hand, there is a huge security risk, with people potentially hacking them and gaining the ability to control someone else’s car because they are connected to the internet. In this example we see that the technology of the car failed, and there wasn’t even an outside controller that made this happen. Tesla is not the only company at fault, though. In an article I was reading recently, Uber experienced a problem as well: one of their self-driving cars killed a pedestrian in Arizona. I think that these companies haven’t totally figured out how to get these cars on the street and need to come up with some kind of backup or way of solving these problems when the technology fails. It is a topic that is going to be discussed a lot in the near future and will generate a lot of controversy. Personally, I don’t think I will ever purchase one, let alone get in one. I enjoy the experience of driving and being in full control of the vehicle. It would be awesome if they could figure it out and solve these problems, but there is a long road ahead and a lot of malfunctions that need to be taken care of first.

    https://www.usatoday.com/story/tech/2018/03/19/uber-self-driving-car-kills-arizona-woman/438473002/

  5. This is going to become such a big topic of conversation with all the new technology coming out in the world, especially these autonomous cars. I agree with you; I am not the biggest fan of cars going on full autopilot, because I believe a human can make far more correct decisions than a programmed car with set expectations can. I recently posted my own blog about the moral dilemma of an autonomous car either killing a pedestrian to save your own life in a crash, or the opposite, saving the pedestrian by crashing in a way that would potentially kill you. Many different philosophical questions are now arising with this new kind of technology. Here is an article you can take a look at that describes the ethical dilemma in more detail.

    https://www.theguardian.com/science/2016/jun/23/will-your-driverless-car-be-willing-to-kill-you-to-save-the-lives-of-others
