According to a preliminary NTSB report released this past week, a fatal car accident in Florida was the first recorded fatality involving a vehicle operating under an autonomous driving system.1
According to reports, a tractor-trailer crossed in front of a Tesla Model S traveling at 74 mph. The Tesla was operating with its “Autopilot” mode engaged, but neither the autonomous system nor the driver saw the tractor-trailer in time to apply the brakes or swerve to avoid a collision.
While the preliminary report does not assign blame to either the driver or Tesla, it has sparked a wide range of responses in the press.
Several former Tesla employees have raised safety concerns in interviews,2 with many claiming that Tesla’s autonomous driving system is not ready for production use. Others, however, have cautioned that accidents involving autonomous driving systems are unavoidable, and that on average these systems do more good than harm and actually save lives, citing a recent example in which the system prevented a collision with a pedestrian.
Tesla, for its part, has responded to these concerns in a blog post, stressing that Autopilot is an entirely optional feature and that the system repeatedly warns drivers to maintain awareness of the road.
Who is responsible for a car accident involving an autonomous driving system?
The future of autonomous driving remains unclear. This tragic accident raises an important question: who is responsible when an advanced autonomous driving system makes the wrong decision and someone is injured or killed? The answer is not yet settled, but the courts will likely have to weigh in at some point in the near future. Meanwhile, the NHTSA has requested more information from Tesla regarding the operation of its Autopilot system.