Fully Driverless Cars in California

California began accepting applications for permits for fully driverless cars on public roads on April 2, 2018, after a rule change in the state allowed companies to test their self-driving cars on the road. These cars can operate without a driver, mirrors, or a steering wheel, which gives companies such as Uber, Waymo, and General Motors the opportunity to identify and fix any flaws they catch in their vehicles, preventing catastrophic failures that could put others at risk.

In my opinion, this is a great rule change for companies looking to test driverless cars in California. Countless accidents happen every year, at times involving company cars, such as Ubers driven by human drivers. Implementing self-driving cars in the future could prevent these potential accidents and keep passengers safe from such dangerous events. I have personally witnessed several incidents that led to huge accidents, whether on the highway or on regular local roads, and they were all caused by human drivers. Testing self-driving cars could change that and significantly reduce the likelihood of accidents. It would be a good idea, and beneficial for everyone on the road.

Source: https://www.theverge.com/2018/4/13/17235788/waymo-permit-fully-driverless-car-california


5 thoughts on “Fully Driverless Cars in California”

  1. I love the idea of a world with self-driving cars. I personally believe that it would greatly reduce, or even eliminate, the number of accidents that happen on the roads every year. I, like so many others, enjoy driving. The summer after I received my license was one of the most fun I ever had. Despite the fun I have when I’m driving, I still think that self-driving cars would be much more beneficial for safety reasons.

  2. As driverless cars become more developed, I find that they can fix many problems, such as human error. Yet at the stage they are in currently, I find that testing them on public roads could prove very dangerous. Several companies are testing self-driving cars, and one of them, Tesla, has come under heavy scrutiny. After a series of major fatal accidents involving Tesla’s cars, the NTSB concluded that a possible cause was over-reliance on Autopilot. This raises a new issue with self-driving cars: instead of human error while driving, we now see human error even with self-driving cars. As self-driving cars develop and become more popular, they cannot account for the drivers around them, or for drivers in the car who do not understand how the cars work. This is very similar to cruise control. As people become more comfortable driving with it, they grow reliant on it to lock their speed and follow the traffic in front. As the technology has developed, these cars are said to be able to stop when they sense the car in front stopping. Yet how can we be sure it will, until we hear the crunch, at which point it is too late? As cars evolve, so do people. We can no longer rely on others to follow the rules of the road; today we need to learn not just to drive, but to drive defensively. That is the aspect I find most self-driving cars need to be programmed with. When I learned how to drive, I was told that learning to drive is not learning the rules of the road alone, but learning to drive defensively, so that if someone does something stupid or incorrect, I can remain at a safe distance. The flaw in self-driving cars as they are developed is that, although on paper they work perfectly, they cannot account for human mistakes. Therefore, maybe create a program that trains people to use self-driving cars, or find a way to program self-driving cars to drive more defensively.

    http://shop.nsc.org/-Defensive-Driving-Course-Self-Study-Study-Guides-P87.aspx
    https://www.nytimes.com/2017/09/12/business/self-driving-cars.html

  3. The idea of self-driving cars is something I haven’t fully been able to wrap my head around. Some argue that they would cause fewer accidents because they limit drunk driving and other human errors, while others argue that they would cause more accidents because they lack the reasoning skills that humans have. I did some research myself to form more of an opinion on the topic, and this is what I found.

    https://www.lemberglaw.com/are-driverless-cars-safe/

  4. I think the idea of driverless cars sounds really cool and like a good idea, but I feel that the logistics would be complicated. It’s hard to imagine a car driving on the road with other cars without a driver and it not seeming dangerous. Car accidents stem from human error, but it is also possible that cars without a human driver will cause accidents. Machines often malfunction, so driverless cars would also be likely to malfunction. According to an article written by CRACKED, one out of every 12 cars got into an accident in California in just six months. This goes to show that driverless cars are likely not a safe alternative to cars driven by humans. The article also stated, “And a longer study of all accidents involving autonomous cars between 2012 and 2015, compared to those involving just regular ones in 2013, found that the former were five times more likely to get into crashes. Even when they controlled for the fact that people don’t usually report minor dings or fender benders, the self-driving cars were still two times as dangerous.”

    http://www.cracked.com/blog/why-self-driving-cars-are-tremendously-dumb-idea/

  5. While the idea of cars that can drive themselves is interesting and quite fascinating, it is still a very risky subject. It is interesting that you mention they have no steering wheel; how do they turn? There are many risk factors involved in letting driverless cars out onto public roads. If a vehicle is somehow hacked, it can lead to a fatal accident, just as if someone were driving the car. According to a recent article by WIRED, Tesla was involved in another fatal Autopilot accident. “This is the second confirmed fatal crash on US roads in which Tesla’s Autopilot system was controlling the car. It raises now familiar questions about this novel and imperfect system, which could make driving easier and safer, but relies on constant human supervision.” The key phrase from the article is that the system still relies on human supervision. While there are smart cars out on the road, sometimes a human’s instinct and reaction are smarter than the car itself. Accidents happen all the time, but that is a human error that will always be a factor. Driverless cars may not always be the smartest decision.
    https://www.wired.com/story/tesla-autopilot-self-driving-crash-california/
