“Active Cruise Control with Stop & Go,” “Super Cruise,” “Lane-Centering Steering,” “Eyesight,” “Traffic-Aware Cruise Control.” Ready or not, autonomous vehicle technology has arrived. Just in time too, because as it turns out, statistics demonstrate that humans are fallible creatures. According to the US Department of Transportation, 94% of automobile crashes can be traced back to driver error. Never fear, technology is here.
Today’s cars can “see,” accelerate, stop, park, stay in their designated lanes and even make lane changes without driver involvement. Each of the foregoing advancements has been developed, implemented and marketed as a safety enhancement. Oh, the irony. To remedy the human failings exposed by our tech obsession, we just need more tech. Heaven forbid we instead renew our commitment to paying attention and respecting the laws of physics. – End old person rant.
Without a doubt, the promise of a reduction in accidents due to speeding and drunk driving is universally appealing. But there’s a problem. Ever notice that when your cellphone gets an update, it never seems to work quite right? New technology is invariably flawed, then updated or “patched” to correct the flaws.
Ushering in a new age of automotive transportation, one in which we can blissfully ignore any culpability for a lack of engagement, is going to be difficult because it won’t happen overnight. There’s going to be trial and error, updates and “patches.” When your tech experiment involves passenger vehicles weighing 3,000 lbs and moving in excess of 60 mph, software flaws and other errors inevitably cost lives. Car crashes, injuries and fatalities aren’t going to disappear entirely in a matter of days, weeks, months, years or even decades. Even our fully autonomous vehicular overlords of the distant future will inevitably be flawed to some degree, which means people will be hurt. When people get hurt, someone is going to get sued (presumably by our autonomous attorney overlords).
Historically, automobile accidents have generated lawsuits based in tort on a theory of negligence. The legal argument is generally summarized as follows: Bill had a duty to operate his vehicle with a reasonable amount of care. Bill breached his duty to Alice when he negligently rear-ended her vehicle. Alice suffered damages as a result of Bill’s negligence, so Bill should be liable to Alice. Pretty straightforward.
Conceptually, it is possible to apply the traditional negligence paradigm to a self-driving car, provided you can make the leap from suing the car’s “occupant” to suing the car’s manufacturer. A plaintiff could argue that a car manufacturer has a duty to sell automated vehicles that operate with a reasonable amount of care. If the plaintiff suffers damages as a result of an accident in which an autonomous car didn’t operate within a reasonable standard of care, the car’s manufacturer may be liable.
One of the first U.S. lawsuits for an accident involving a self-driving car was filed in a San Francisco federal court earlier this year. In that case, a Chevrolet Bolt in cruise automation mode struck a motorcyclist. The Bolt was in heavy traffic when it started to merge into the left lane. At the same time, a vehicle in the left lane slowed down, so the autonomous car moved back into its original lane. Upon moving back into its original lane, the Bolt hit the motorcyclist. The motorcyclist was knocked to the ground and suffered minor injuries. Although there was a GM employee in the car, he didn’t have his hands on the wheel.
The plaintiff in the above-referenced case sued GM on a negligence theory, stating that GM “owes a duty of care in having its self-driving vehicle operate in a manner in which it obey[s] the traffic laws and regulations,” but that GM breached that duty because “its self-driving vehicle drove in such a negligent manner that it veered into an adjacent lane of traffic without regard for a passing motorist.” GM settled the case on undisclosed terms, so many of the legal ramifications of the autonomous vehicle development process remain wholly unanswered. Indeed, some legal scholars have opined that the age of autonomy will necessitate a shift from vehicular negligence to product liability.
Like the technology itself, the law governing autonomous vehicles is in its infancy. Some states, such as Arizona and California, are enacting laws to encourage manufacturers to test and operate on their public roads. In 2015, Arizona Governor Doug Ducey signed an executive order which instructed state agencies to take “any necessary steps to support the testing and operation of self-driving cars on public roads within Arizona.” The order required autonomous vehicles to have an “operator” capable of directing the vehicle’s movement if necessary. However, in March 2018, the order was updated to permit fully autonomous vehicles to operate on public roads without an operator, provided the vehicle achieves a reasonably safe state following a failure of its autonomous systems. Notably, on March 18, 2018, a pedestrian was struck and killed by an automated Volvo SUV in Tempe, Arizona. The police report indicated that the pedestrian was at fault, having stepped off a median and jaywalked into the path of the SUV.
As of today, human technical prowess is outpacing the legal creativity that will be required to address complications associated with autonomous machines, including autonomous vehicles. Questions abound: If a computer is programmed to operate only within a designated standard of care, can it be negligent? What are the implications for the auto manufacturers, the software developers (e.g., Google), and the commercial end users (e.g., Uber, Lyft)?
Although investors are pumping billions of dollars into autonomous vehicles, the public’s acceptance of the technology may be slow to materialize. While people are quick to point out flaws in others’ driving, they are typically reluctant to give up control. As such, the timeline for a cultural shift to automated vehicular travel has yet to be established. For now, we are left to imagine a world full of electronic chauffeurs in the abstract.
If you have any questions regarding the integration of autonomous vehicles and the law, please feel free to contact me, Benjamin Dill at firstname.lastname@example.org, or 804-377-1272, or Steve Setliff at 804-377-1261, or email@example.com.