Self-driving cars and issues of crash liability

Jan 20, 2017 | Car Accidents

Self-driving cars are widely seen as the next big goal for automakers. The technology has been improving rapidly, and while no fully autonomous vehicles are commercially available yet, many cars already include features that assist drivers.

Unfortunately, driver-assistance technology can be more dangerous than full autonomy when drivers are unclear about which driving tasks remain their responsibility. That may have been the problem behind a May 2016 crash that killed an Ohio man as he was traveling in his Tesla Model S.

The driver was using “Autopilot,” a driver-assistance feature with an arguably misleading name. According to investigators from the National Highway Traffic Safety Administration, the driver-assistance software was working as designed, so Tesla was not deemed liable for the crash from a design and manufacturing standpoint. However, the NHTSA stressed that all automakers must clearly warn and educate drivers about the limitations of driver-assist features. At the very least, companies need to name their driver-assist technology in ways that do not falsely imply full autonomy.

Tesla’s Autopilot is reportedly effective at preventing rear-end collisions but far less reliable in cross-traffic situations, and last year’s fatal accident involved that latter scenario.

If fully autonomous vehicles eventually become the standard driving experience, we could see many tragic accidents during the transition from fully manual to fully autonomous driving. Drivers may be confused about their responsibilities when their vehicles are in driver-assist mode, and they may not realize that their vehicles have certain limitations in accident scenarios. Until these problems are worked out, it may be in everyone’s best interests to use this technology with extreme caution.
