Article of Interest

The Safety, Moral, and Legal Implications of Self-Driving Vehicles
By: Tennessee Walker

Self-driving vehicles are here—driving alongside us. They are no longer a futuristic idea. What this means from the safety, moral, and legal perspectives is an ever-growing topic of discussion—both locally and nationally. I am often asked, “You’re an attorney, what do you think?” Well, here you go.

Safety and Morality Issues

A self-driving car recently malfunctioned in Kaufman, Texas—driving into a guardrail at 80 miles per hour. That incident brought local attention to the issues surrounding self-driving cars. The current and future application of artificial intelligence (including self-driving vehicles) was also the topic of a recent WIRED Magazine interview with Joi Ito (entrepreneur and MIT Media Lab director) and President Obama.

The WIRED article focuses heavily on the issue of morality. As drivers, we make moral decisions each time we get behind the wheel. Sometimes those decisions are literally a matter of life or death. The decision to use the fully automated function in a self-driving vehicle is also a decision to turn over moral decision-making to a machine. Should self-driving vehicles be programmed to minimize the potential death toll? What if that means steering the vehicle into a wall (likely killing the owner and occupant of the vehicle) to save multiple pedestrians who are crossing the street? Moral questions such as this one are the subject of some noteworthy current research:

The results are interesting, if predictable. In general, people are comfortable with the idea that self-driving vehicles should be programmed to minimize the death toll.

This utilitarian approach is certainly laudable but the participants were willing to go only so far. “[Participants] were not as confident that autonomous vehicles would be programmed that way in reality—and for a good reason: they actually wished others to cruise in utilitarian autonomous vehicles, more than they wanted to buy utilitarian autonomous vehicles themselves,” conclude Bonnefon and co.

And therein lies the paradox. People are in favor of cars that sacrifice the occupant to save other lives—as long as they don’t have to drive one themselves.

See https://www.technologyreview.com/s/542626/why-self-driving-cars-must-be-programmed-to-kill/ (emphasis added).
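
To make the utilitarian idea concrete, below is a deliberately simplified sketch, in Python, of what a “minimize the death toll” rule might look like. Every name and number in it is hypothetical and chosen for illustration only; production autonomous-vehicle software involves layers of perception, uncertainty, and engineering that this toy example ignores.

```python
# A toy illustration of a utilitarian crash-decision rule. All names and
# numbers are hypothetical; this is not how any real vehicle is programmed.

from dataclasses import dataclass

@dataclass
class Maneuver:
    description: str
    expected_occupant_deaths: float    # an estimate, never knowable with certainty
    expected_pedestrian_deaths: float

def utilitarian_choice(options: list[Maneuver]) -> Maneuver:
    """Pick the maneuver with the lowest total expected death toll,
    weighing the vehicle's occupants and pedestrians equally."""
    return min(
        options,
        key=lambda m: m.expected_occupant_deaths + m.expected_pedestrian_deaths,
    )

# The scenario described above: stay the course into multiple pedestrians,
# or swerve into a wall and likely kill the occupant.
options = [
    Maneuver("stay the course", expected_occupant_deaths=0.0,
             expected_pedestrian_deaths=3.0),
    Maneuver("swerve into the wall", expected_occupant_deaths=1.0,
             expected_pedestrian_deaths=0.0),
]

print(utilitarian_choice(options).description)  # prints "swerve into the wall"
```

Note that a rule this simple gives the car’s occupant no special weight. That equal weighting is exactly what the survey respondents endorsed in principle but resisted when it came to their own vehicles.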

On October 19, 2016, Tesla announced that “all Tesla cars being produced now have full self-driving hardware.” Huge announcement, OR IS IT? Tesla’s announcement goes on to provide key details:

Before activating the features enabled by the new hardware, we will further calibrate the system using millions of miles of real-world driving to ensure significant improvements to safety and convenience. While this is occurring, Teslas with new hardware will temporarily lack certain features currently available on Teslas with first-generation Autopilot hardware, including some standard safety features such as automatic emergency braking, collision warning, lane holding and active cruise control. As these features are robustly validated we will enable them over the air, together with a rapidly expanding set of entirely new features….

See https://www.tesla.com/blog/all-tesla-cars-being-produced-now-have-full-self-driving-hardware.
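
As an aside for the technically curious, “enable them over the air” means the capability ships in the car but stays switched off until the manufacturer flips a remote flag. Here is a minimal, purely hypothetical sketch of that pattern in Python; none of these names reflect Tesla’s actual software.

```python
# A hypothetical sketch of over-the-air feature gating. The flag names are
# invented for illustration and do not reflect Tesla's actual systems.

# Flags the manufacturer publishes remotely once a feature is validated.
# A real vehicle would fetch these from a server and verify their
# authenticity (for example, with a cryptographic signature).
remote_flags = {
    "automatic_emergency_braking": False,  # hardware installed, not yet enabled
    "collision_warning": False,
    "lane_holding": False,
    "active_cruise_control": False,
}

def feature_enabled(name: str) -> bool:
    """A feature may run only if its remotely published flag is on."""
    return remote_flags.get(name, False)

if feature_enabled("automatic_emergency_braking"):
    print("Automatic emergency braking is active.")
else:
    print("Hardware present, but the feature awaits validation.")
```

The design point is the one Tesla’s announcement makes: hardware and software ship together, but safety-critical behavior is withheld until the manufacturer decides it has been validated.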

In my opinion, Tesla appears to be taking a step back. At a minimum, the company is acknowledging (1) potential safety issues with its first-generation Autopilot hardware, and (2) the need for extensive safety testing before rolling out the next generation. Call me crazy, but I do not think the Tesla announcement will do anything to change the general public’s apparent reluctance to trust self-driving technology.

The Hot Button Legal Issues

We have all heard and pondered the question, “If a tree falls in the forest and no one is around to hear it, does it make a sound?” What about this question: “If I am hit by a self-driving vehicle that is operating on autopilot, who is at fault?” Is it the non-driving owner and/or occupant of the vehicle? Is it the vehicle manufacturer? Am I just out of luck?

I feel confident that the vehicle manufacturer stands to be held legally responsible for autopilot malfunctions. But I also believe that the owner and/or occupant of the self-driving vehicle remains legally responsible, even though he or she was not in control of the vehicle. For your sake and mine, let’s hope I am right.

Why is holding the owner/occupant of a self-driving car legally responsible important? Because excusing the owner/occupant of self-driving vehicles from liability and limiting your legal rights to claims against the vehicle manufacturers would make justice cost-prohibitive. Claims against manufacturers are typically products liability actions—as opposed to general negligence actions. Products liability actions are expensive, time-consuming, and complicated—hence, they are cost-prohibitive for many injured people. The last thing our society needs is to further chip away at our already lacking access to justice.

So, how do I arrive at my opinion that the owner/occupant of the self-driving car is also legally responsible—even though he or she was not actually in control at the time of the collision? I ask the following question: “Is using the autopilot function in a self-driving vehicle a negligent act?” I say yes.

Don’t get me wrong: we trust our lives to technology in numerous instances. Traffic lights are automated, computers analyze and report on blood work, commercial planes have autopilot functions, and so on. So what makes self-driving vehicles different? I think it comes down to the morality issue. With self-driving vehicles, the question is not merely whether they will be dependable and work properly; it is whether they will make proper moral decisions.

Is the general public ready to rely on machines to make moral decisions? I say no. Set aside your personal thoughts and beliefs about the technology and ask yourself, “Would a Texan of ordinary prudence (i.e., a man or woman of average intelligence) entrust his or her safety—and the safety of his or her fellow Texans on the road—to a fully automated driving process?” Again, I say no, and the above-cited research backs me up. Under Texas law, doing that which a person of ordinary prudence would not have done under the same or similar circumstances is negligence. If I am right (and I think that I am), then the use of autopilot is a negligent act. More importantly, if I am right, our society will avoid a further decrease in access to justice.

Tennessee Walker is a plaintiff’s trial attorney with Patterson Law Group in Fort Worth.


Views and opinions expressed in eNews are those of their authors and not necessarily those of the Texas Young Lawyers Association or the State Bar of Texas.
