Uber Crash Raises Liability, Data Access Questions



By Martina Barash

The potential liability of Uber and others for the death of a pedestrian struck by a self-driving test vehicle in Tempe, Ariz., raises complicated legal and data-access questions.

Local and federal investigations into the Uber Technologies crash are just getting underway, so the facts aren’t yet well understood. But the quick appearance of video from the vehicle’s front-mounted camera points to a significant factor for accident probes and litigation alike in this new age of automated vehicles: the availability and control of vehicle data.

“I haven’t heard any suggestions of this being anything other than a terrible crash incident,” Jason Levine, executive director of the Center for Auto Safety in Washington, told Bloomberg Law.

But after determining the facts related to the crash, “the question is, is there a defect in the software? In the hardware? If so, is Uber responsible? Is Volvo responsible?” People aren’t ready to wrestle with those questions—and there are no regulations governing the automated systems, he said.

“Dozens” of companies were likely involved in creating the Uber test vehicle’s technology, Levine said.

Uber should handle its data carefully because its credibility is at stake, Professor Bryant Walker Smith of the University of South Carolina School of Law in Columbia, S.C., told Bloomberg Law.

After the accident, Uber should take pains to preserve the vehicle information it collected. It “should have backed off from all of their onboard and off-board systems,” and access should be “under the supervision of a government investigator and probably another third party,” he said.

Richard McCune of McCune Wright in Ontario, Calif., who represents motor vehicle plaintiffs including some who have sued Tesla Inc. alleging sudden acceleration in some of its advanced vehicles, said his experience litigating against automakers suggests investigators should “trust but verify” vehicle data.

Litigation could come under several possible theories, even though the test vehicle isn’t marketed to the public, Smith said. Such claims, which would most likely be based on negligence or product liability-type theories, could hinge on any of a number of “decisions made in the course of the entire testing program.”

Or, instead, it could come down to something common in many vehicle accidents: simple human error, either on the part of the pedestrian or the back-up driver, or a combination of the two.

The Tempe police have said video from the Uber’s camera shows the pedestrian suddenly stepped out in front of the vehicle from the median. The human driver seated behind the steering wheel first became aware of the victim, Elaine Herzberg, 49, when she was hit, according to the police. The incident took place at about 10 p.m. on March 18.

“There certainly can be circumstances where it doesn’t matter if it’s human or computer,” and the accident would have happened anyway, plaintiffs’ attorney McCune said.

But while computers aren’t expected to be perfect, glitches that we accept in a desktop computer present “a different scenario on four wheels,” he said.

Uber couldn’t be reached for comment.

To contact the reporter on this story: Martina Barash in Washington at mbarash@bloomberglaw.com

To contact the editor responsible for this story: Steven Patrick at spatrick@bloomberglaw.com

Copyright © 2018 The Bureau of National Affairs, Inc. All Rights Reserved.
