A Contest That Uber Didn’t Want To Win But Did

It finally happened, unfortunately. A self-driving car claimed its first fatality. There will regrettably be more; that's a given. This time it was a pedestrian who was killed. But the next time it may be a driver in a front-end collision with another vehicle. Or a side impact. Maybe a rear-ender. Or a rollover. Who knows?

Car accidents come in many variations. But the point is this: in the starkest and coldest terms, the companies developing autonomous cars ultimately view these fatalities as nothing more than the gathering of data.

While I expect Uber to issue the standard "how could this happen?" and "we're so sorry" statements of corporate contrition, money will change hands and the case will almost certainly be settled, with such payments viewed as nothing more than the cost of doing business when developing new technologies. Ford's infamous memos, in which its actuaries calculated that it was cheaper to pay legal settlements than to fix a known defect, come to mind.

It's interesting that this happened to Uber, a company that is, shall we say, ethically challenged given its notorious past business practices. I'm not saying that Uber is the Dr. Mengele of autonomous vehicle experimentation, but it wouldn't surprise me if it cut corners somewhere or engaged in other dubious practices that may have contributed to the accident. It's said that with autonomous cars, there are no bad drivers. Maybe. But there are bad programmers, bad decision-makers, and bad timing. And there are unique circumstances that even a bad driver may be better equipped to handle than a computer that hasn't encountered them before and whose programmers haven't yet updated it accordingly. And if the car gets a virus? That's a bigger issue.

What was the operator of the vehicle doing? Texting? And how is Congress addressing the issue? By limiting people's right to sue, of course! So who can be sued by the pedestrian's family? Uber, certainly, but there may be other parties involved. The software vendor? The operator who didn't react? Perhaps the city or state that authorized the testing of the vehicle? The contractor who built the road? Were there warning signs posted anywhere that autonomous vehicles were being tested in the area? Plaintiffs' attorneys can get very creative when pursuing legal theories. And why shouldn't new technologies give rise to new legal theories? Why should lawyers be any more constrained in what they may argue when addressing the liability these new technologies create than the companies that develop those technologies in the first place? Things will always go wrong. But things will go right, too.

Autonomous vehicles are indeed a technology that is heading towards us like an avalanche towards a resort. It’s just unfortunate that there will be numerous real casualties—actual people hurt and killed—along the way.
