Not too long ago, a Tesla driver and passenger in Canada were caught sleeping while the car drove itself down a highway at over 90 mph. Both occupied front seats were fully reclined, meaning neither the driver nor the passenger was in position to take over control of the car if something went wrong. An article by The Verge states that “officers began to pursue the vehicle with their emergency lights flashing, at which point the vehicle ‘automatically began to accelerate’” (Hawkins). This seems extremely dangerous, and if I were in that car, I know I would be terrified. According to an article by BBC News, “Tesla cars currently operate at a level-two Autopilot, which requires the driver to remain alert and ready to act, with hands on the wheel” (BBC). If the system doesn’t sense any hands on the steering wheel, it is supposed to disable itself. Therefore, either the system malfunctioned or the owner found a trick to keep Autopilot engaged. One known workaround was to “wedge an orange against the wheel to simulate the pressure of a human hand” (Hawkins). It’s strange to me that both the driver and the passenger were willing to trust the car with their lives when its system can be so easily fooled or can simply fail to work properly.
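To see why a trick like the orange works, consider a simplified, purely hypothetical sketch of hands-on-wheel detection. This is not Tesla's actual code; it only assumes, as many production systems do, that the car infers a driver's grip from torque on the steering wheel. The function name, threshold, and sample values below are all invented for illustration:

```python
# Hypothetical sketch (not Tesla's actual logic): a level-two driver monitor
# that infers "hands on wheel" from steering-wheel torque readings.
# A constant weight, like a wedged orange, applies steady torque, so this
# naive check cannot tell it apart from a human hand.

def autopilot_should_disengage(torque_readings, threshold=0.5):
    """Return True if no reading exceeds the torque threshold,
    i.e. the system believes no hands are on the wheel."""
    return all(abs(t) < threshold for t in torque_readings)

# A real grip produces varying torque; an orange produces a steady value
# above the threshold. Both pass this check equally well.
human_grip = [0.0, 0.8, 0.2, 1.1]     # fluctuating torque from a hand
wedged_orange = [0.9, 0.9, 0.9, 0.9]  # constant torque from a weight
no_hands = [0.0, 0.1, 0.0, 0.0]

print(autopilot_should_disengage(human_grip))     # False: hands detected
print(autopilot_should_disengage(wedged_orange))  # False: fooled by the orange
print(autopilot_should_disengage(no_hands))       # True: would disengage
```

The sketch shows the core weakness: a sensor that only measures pressure or torque checks for a physical force, not for an attentive human, so anything that supplies that force defeats it.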
This incident raises questions about how safe Tesla cars really are. I believe that in any self-driving car, there should always be an experienced driver who is ready to take the wheel if something goes wrong. The idea of being able to do whatever you want, like sleeping, while the car does all the driving would definitely be cool, but as of right now, it is clearly not safe to put all your trust in a self-driving car.