The recent death of a pedestrian struck by a self-driving Uber has raised many questions, most notably about liability.
Recently, a pedestrian was killed after being struck by a self-driving Uber vehicle in Arizona. The fatal crash is the first of its kind: a pedestrian fatality involving an autonomous vehicle. Now, as investigators work to untangle the mess of data, questions arise about liability, legality, and how this could have been prevented.
As we have covered before, Uber, Lyft, and other rideshare companies have already opened a serious can of worms when it comes to liability for damages and injuries. But the recent self-driving fatality raises new questions about liability for autonomous cars.
After all, who is to blame? Is Uber fully responsible, since it was their car that caused the death? What about the company that created the self-driving technology in the first place? Perhaps it's the vehicle manufacturer's fault, or the safety driver who was present in the vehicle, or some combination of all these parties?
Additionally, the crash will almost certainly raise questions about any indemnification agreements that may have been signed by the companies working to develop self-driving cars and those working to get them on the road to make money.
WARNING: Footage may be disturbing. Video from Tempe Police via ABC Action News YouTube channel.
How did the Uber fatality occur?
On Sunday, March 18, 2018, 49-year-old Elaine Herzberg was walking her bicycle across a four-lane road when she was struck by the autonomous Uber, according to media reports. Herzberg was not using a crosswalk when she was struck in Tempe, a suburb of Phoenix, Arizona. She later died from her injuries at the hospital. The autonomous vehicle that struck her was traveling at 40 mph and had a human "safety driver" on board. That driver, Rafael Vasquez, can be seen in video of the incident looking down for several seconds as the car hit and killed the pedestrian.
Could the accident have been avoided?
After reviewing the video footage of the crash, the Tempe Police Chief said that, whether the car was driven in autonomous mode or by an actual human driver, it “would have been difficult to avoid this collision…based on how she came from the shadows right into the roadway.”
However, when it comes to liability, it is not the Police Chief who determines fault; that responsibility lies with the Attorney's Office.
Does it matter that the pedestrian was not using a crosswalk?
In most states, including Arizona, drivers are required to exercise a duty of care to avoid hitting pedestrians, whether or not they are using a crosswalk. Therefore, although crossing in a designated area would have significantly reduced the pedestrian's risk of injury or death, her failure to do so does not free the driver (or the other parties involved) from liability.
What parties could potentially be held responsible for the autonomous Uber fatality?
Since the vehicle that hit and killed Herzberg was operating as an Uber at the time, Uber could obviously be named as a defendant in the case. Likewise, other third parties, such as the vehicle manufacturer, Volvo, and the company that created, programmed, and installed the self-driving technology, could also be held liable. Vasquez, who was serving as the safety driver for Uber, could also be named as a defendant in any future litigation.
What role will self-driving technology play in any litigation?
Under normal circumstances, car accident cases focus on driver negligence in order to pursue litigation against the at-fault parties. However, since the autonomous system introduces a very different angle here, investigators and attorneys will have to look into any possible defects in the design, programming, or installation of the technology. In that scenario, negligence is irrelevant, since the law requires only proof of a defect to pursue a third party in these types of cases.
It will then be up to the two sides to prove either that there was a defect that caused the self-driving Uber to strike and kill the pedestrian, or that no self-driving vehicle or human driver would have been able to avoid the crash.
Since this area of law is so new, it is still not clear what a court would make of the argument that "no other self-driving car could have avoided the collision." After all, does that excuse the tragedy?
Were there any indemnification agreements?
An article by Reuters.com raises the question of whether Uber, Volvo, or any of the self-driving technology companies involved signed any type of indemnification agreement during their dealings to put these vehicles on the road.
The term indemnity refers to compensation for damages or loss, and in a way, to an exemption from liability for damages. Indemnity is most often brought up in contractual agreements in which one party agrees to pay for potential losses or damages caused by the other party.
Why would anyone sign one of these agreements, you might ask? To keep an agreement moving forward. For example, Uber may have signed an indemnity agreement in order to get the self-driving technology into its cars, or the self-driving technology company may have signed one with Volvo, or any combination of the parties involved.
This is an important question in this case, since the apportionment of liability may hinge on any of these preexisting agreements.
So how is it possible that the self-driving technology didn’t detect the pedestrian?
Investigators from the National Highway Traffic Safety Administration and the National Transportation Safety Board are looking into this very question. In pursuit of answers, they will surely be interviewing the safety driver, reviewing the physical evidence from the scene of the accident, and looking at any and all of the self-driving car’s data. With all these modes of information, it’s likely an answer will eventually be found.
There is no doubt that the manufacturers of self-driving technology and the rideshare company Uber exhaustively considered this very issue before sending a self-driving car out onto the road. But how did the array of sensors, software and hardware, and AI programs fail?
TechCrunch.com recently published an article examining how self-driving cars are supposed to detect pedestrians and other potential hazards on the road. The article goes over a series of possibilities that could have caused the fatal accident. It notes that self-driving Ubers are equipped with multiple layers of imaging hardware and software, including infrared lidar, front-mounted radar, and short- and long-range optical cameras, any of which should have detected Herzberg.
What does all this mean for the family of the deceased?
No matter what the cause of the accident, or who was at fault, it does not change the fact that someone's family has been torn apart by this tragic loss. And for what? So that these companies could test out a new technology? Are we willing to sacrifice lives to get more convenience on the road sooner? As a society, we will have to face these issues, seemingly sooner rather than later.
Although much is still uncertain in this case, one thing is for sure: there will most likely be a lawsuit over the wrongful death of Elaine Herzberg. Who that lawsuit will be against, and in what capacity, remains to be seen.
Dolman Law Group Accident Injury Lawyers, PA is a personal injury firm with offices in Clearwater, St. Petersburg, New Port Richey, Sarasota, and North Miami.