It’s exciting — and scary — that we will be “driving” self-driving cars soon. How quickly and safely depends upon the actions of lawmakers in the next few weeks.
Today, much less restrictive Department of Transportation guidelines are being announced. Last week the U.S. House passed the SELF DRIVE Act, and this past Friday a similar bill with bipartisan support was introduced in the U.S. Senate.
While these new laws and regulations will expedite the development of self-driving technology, they unfortunately relax standards for vehicle safety. The auto industry cheered the new rules and plans to rush new models to market, but many safety advocates have voiced objections. Why? These laws exempt manufacturers from crucial safety regulations governing braking, airbag and steering systems.
We know from the millions of Takata airbags that had to be recalled after some exploded and shot shrapnel into drivers and passengers, killing at least 11 people, from the GM ignition switches that suddenly shut engines off, and from the other auto safety debacles of the past 10 years that our vehicles are already more dangerous than ever.
Thorny new legal questions
And how will personal injury litigation be affected? Here are just some of the questions lawyers and judges will have to confront:
- Who will be held responsible if there is a collision — the manufacturer, say Tesla? The software developer, say Google? The driver, say you?
- How much control will the driver have to maintain?
- Can the driver override the self-driving mode to prevent a collision, and if he or she doesn’t, will he or she be at fault?
- Can these vehicles be hacked into and disabled, causing crashes?
Just yesterday Google accepted some of the liability for the crash that killed its driver last year as he was test driving a Tesla.
How will courts determine liability without complicated and expensive scientific and engineering experts? What if the driver claims the software suddenly stopped working? What if it actually does? What if the driver fails to download needed updates? It’s a potential legal nightmare.