A self-driving Uber struck and killed a pedestrian on Monday. It unfortunately won’t be the last time this happens.
The 49-year-old woman was walking her bike across the road in Tempe, Arizona, when the SUV ran over her. A “driver” was behind the wheel, but the vehicle was in autonomous mode. The person in the driver’s seat was supposed to safeguard against this type of collision, but that safeguard clearly didn’t work.
This horrific accident highlights an important question: is this new technology ready to hit public streets?
I am a big fan of vehicle automation because it takes human error out of the equation. Self-driving cars may one day finally put an end to drunk, distracted, speeding, and tailgating car accidents. But those advantages are too far down the road at this point.
As I discussed in a previous post, these cars should not be raced onto our roadways without thorough testing and strict safety regulations. We aren’t there yet.
But where there is money to be had, lobbyists abound and bad legislation often follows. This situation follows that tired pattern. The House passed the SELF DRIVE Act in September 2017, which scraps safety standards to expedite production.
The related Senate bill goes further by protecting negligent self-driving vehicle companies from liability. Maybe this latest accident will convince the Senate to slow down.
Stop or speed up?
Automated vehicles can sense a car in front to brake or an adjacent car to determine when it can safely switch lanes. GPS guides the car in the right direction. These features tend to work well for highway driving, but not on residential streets.
Inclement weather is not as easily recognizable to the vehicle’s sensors and may prompt the car to simply stop, putting other motorists at risk. Worse, the sensors might fail to recognize people, animals, or plants and run them over.
Self-driving cars recognize common signs. But what happens if something is off? Paste a sticker on a stop sign, and the car might misinterpret it as a speed limit sign and take off. Scary, isn’t it?
One technology being tested — on crowded Las Vegas streets, no less! — is the use of a remote driver. Considering that the person behind the wheel in the fatal pedestrian accident couldn’t prevent the collision, operating a car from a faraway location is ridiculous until the technology is perfected.
Who is at fault? We may never know if arbitration law passes
Generally, the threat of large lawsuits acts as a healthy deterrent against reckless release of unsafe products. A loophole in the proposed legislation would wipe out this crucial incentive.
The AV START Act would force people severely injured by driverless vehicles to take their claims to arbitration and would bar them from filing a class action lawsuit. Arbitration shifts power to large corporations, whereas class actions have traditionally empowered individual plaintiffs by allowing them to join forces and share costs. From the get-go, this arbitration law would put severely injured plaintiffs at a serious disadvantage.
Arbitration agreements typically contain nondisclosure clauses. As we’ve seen in the high-profile Stormy Daniels lawsuit against Donald Trump, these confidentiality agreements can have a chilling effect that, predictably, favors the wrongdoer. Nondisclosure clauses also suppress the public discourse that could expose a company’s bad conduct.
Companies will eventually cash in on driverless cars. Without the risk of lawsuits, they have every reason to rush these new cars onto the market as soon as possible, even if they are a danger to the public.
Self-driving cars need to be parked until they are ready to drive safely.
Update on 3/29/18: The claim brought by the victim’s estate has already been settled for an undisclosed amount of money.