Everyone is excited about the prospects of self-driving vehicles, but is the technology safe enough for the roads? That question remains the subject of considerable debate. Nevertheless, vehicle companies like Tesla have already incorporated semi-autonomous technology into their cars, which drivers are currently operating in New Jersey and throughout the rest of the country.
A recent crash involving a Tesla vehicle whose driver had engaged its semi-self-driving features suggests that either the technology or its human operators are not quite ready for mass application. In the accident, a California driver who had engaged his Tesla "Autopilot" feature crashed into the back of a fire truck.
According to individuals familiar with the incident, a semitruck was directly in front of the Tesla when it approached the stationary fire truck. The semitruck moved into the left lane to maneuver around the fire truck. The Tesla driver, however, did not have enough time to react, and neither did the car's driverless software: the Tesla crashed directly into the back of the fire truck.
Fortunately, no one was hurt. The questions remain: Could this crash have been avoided if the driver had not been relying on his Tesla's self-driving "Autopilot" feature? And was Tesla to blame, or the driver? Questions like these will doubtless need to be answered by the courts, including in New Jersey, and the answers may help shape the landscape for years ahead as personal injury lawyers, corporate defense lawyers, judges and the legal system in general try to establish a basis for determining financial liability in the self-driving car crashes of the future.
Source: Los Angeles Times, "Tesla crash highlights a problem: When cars are partly self-driving, humans don't feel responsible," Russ Mitchell, Jan. 25, 2018