
Are judges and the justice system ready for driverless cars?


Vol. 105 No. 2 (2021) | Judicial Independence

Autonomous vehicles have long ignited the American imagination. Increasingly, they have caught the attention of lawyers and judges as well.

The integration of autonomous vehicles (AVs) represents a startling shift for society and is expected to disrupt a range of social and economic systems. Drivers, insurance companies, AV manufacturers, and government officials will look to the legal system to navigate this uncharted terrain and to settle the criminal and civil disputes that will inevitably arise.

Judges will play an integral role in this process. While juries will certainly be responsible for determining the outcome of many of these cases, judges can and will be the final arbiters in situations where a jury is not deemed necessary. And judges’ decisions regarding admissibility of evidence, appropriateness of expert testimony, and a host of other substantive and procedural issues will undoubtedly affect these cases.

How and what do judges think about AVs? Recent scholarship has identified a need for judges to learn more about the emerging technologies that are likely to become the focus of new cases.1 However, until now, academic literature has not examined judges’ existing perceptions and attitudes regarding AVs. Judges play a critical role in the introduction of any new technology, particularly when that technology is implicated in accidents or when it conflicts with existing technologies. One might assume that judges would tend to look less favorably upon new technologies, perhaps because the technology is unfamiliar and seems to bring with it a host of legal problems. History, however, teaches differently. At the dawn of the automobile era in the early 20th century, when horse-drawn carriages were still omnipresent, judges were often tasked with the decision of whether to attribute blame for an incident to an automobile or to a horse’s response to an automobile.2 Although judges could have found fault with the new technology, their rulings affirmed automobiles as a lawful means of participation in traffic, setting the expectation that carriage drivers would adjust to new motorized traffic patterns.3 It is easy to see how blaming automobiles for disrupting horse traffic might have had a chilling effect on the introduction of this new technology.

Autonomous Vehicles and Liability

According to the National Highway Traffic Safety Administration (NHTSA), 36,096 people were killed in car crashes in 2019.4 A 2018 study showed that a staggering 94 percent of crashes were caused by driver error (the remaining 6 percent were attributed to vehicle defects, environmental factors, or unknown causes).5 Despite a few highly publicized crashes involving AVs,6 driverless vehicles are well positioned to reduce this number.7 But AVs will create complex legal questions. In a crash involving one or more AVs, legal liability will be murky. Juries will be required to determine whether a person is at fault for intervening, or perhaps for not intervening. Civil liability cases will determine whether AV manufacturers or the manufacturers of subcomponents are at fault. Artificially intelligent systems may be scrutinized for decision-making processes that result in one fatality to the benefit of another who is left unharmed. Additional complexities arise in cases where an AV collides with a vehicle driven by a person.

Opinions are mixed regarding whether the existing legal framework is sufficient to adjudicate AV cases.8 One option is to treat AVs like entirely new products, creating new laws to define what is expected of AV manufacturers and identifying guidelines for liability.9 Other scholars have proposed ways of integrating AV technology into existing laws. For example, Sophia Duffy and Jamie Hopkins have proposed looking to canine law as a viable solution for AVs: Owners would be liable for their vehicles, but only if the owner had sufficient information about the vehicle to know that it was dangerous and thus demonstrated negligence in the vehicle’s operation.10 Kyle Colonna has similarly suggested that current liability structures are helpful in this context, comparing AVs to automated transportation systems like elevators (largely out of the passengers’ control), airplanes (which have been using autopilots for most of the 20th century), or trains and trams (which often run with minimal or no human supervision).11 The current tort system handles such cases with relative ease.12 Researchers have also proposed “no liability” systems, where national tax-supported insurance funds reimburse injured parties without determining fault, thus removing potential liabilities from AV manufacturers and AV drivers while promoting the production and adoption of AVs.13

Against this backdrop, states have begun to respond to the explosion in AV technology by proposing new legislation and issuing executive orders.14 As of July 2020, 40 states and the District of Columbia had determined that self-driving vehicles are street legal in some capacity.15 To date, 256 laws have been passed across the country pertaining to AV use,16 with every indication that more will come.17

 

Graph: Number of laws introduced governing autonomous vehicles, by year, showing exponential growth in AV legislation after 2014.

Graph: States that have enacted AV laws, as of September 2021. In most states, AVs are legal or legal with a safety driver; only a few states have no AV laws to date.

Another common line of research inquiry for AV technology revolves around autonomy and privacy. Researchers have noted that AV technology might eventually curtail driver autonomy by removing features that would allow humans to take control over their vehicles.18 Additionally, scholarly sources describe how AV data collection will likely include a driver’s destination history, future itinerary, and more.19 For example, information about where a car is parked could be used to extrapolate personal information about an AV owner or user, such as wealth or purchasing habits.20 While there are questions about the scope of the information gathered and who will have access to that information, there is a looming concern that private industry and government organizations could use this data inappropriately and to the detriment of end users.21

Despite the diverse range of opinions and propositions regarding the integration of AVs into the legal system, there is little research that assesses judges’ knowledge of AVs, or their perceptions of current law governing AV-related legal issues. Such research would illuminate a population that will wield tremendous influence over the application of current law to cases involving AVs. Moreover, judges might have insight into which legal frameworks are currently sufficient and which might need further elaboration. Finally, assessing judges’ knowledge about AVs can help identify opportunities for continuing education efforts, training, or outreach programs.

What do judges know about autonomous vehicles?

To assess judges’ knowledge and beliefs regarding AVs, we convened three focus groups during the second quarter of 2019. These focus groups included judges who attended courses at the National Judicial College (NJC), an international center for judicial education associated with the University of Nevada, Reno. The judges came from different jurisdictions around the United States and were in Reno for two weeks to attend courses in general jurisdiction (a course for new judges), impaired-driving case essentials, and an advanced specialty course devoted to traffic law. The opportunity to participate in a lunch-hour focus group was relayed to them by the coordinators of the respective courses. For two focus groups, judges were invited to join the researchers in a nearby conference room after their lunch. For one of the three focus groups, we provided volunteer judges with a light lunch. We provided no incentives or compensation for participants. Approximately 30 judges participated in total. Though demographic self-descriptions were not solicited, about a quarter of the participants were women. Each meeting was approximately 45 minutes long, though some participants joined while the discussion was already in progress.

The format involved the moderator asking broad questions that defined a topic of discussion, though the judges occasionally took the discussion in directions not previously anticipated by the researchers. In each of the three sessions, a faculty member and a graduate student took notes on the content of the discussion. Sessions were not recorded. The primary topics of discussion were the benefits and harms of AVs; privacy regarding AV data; cybersecurity and terrorism; current laws and the potential adoption of new laws; liability; and the potential for legal problems in the future, especially during a time of transition when both non-autonomous and autonomous vehicles share the road.

As a group, the judges were approachable, talkative, and engaged, and they showed considerable interest in the topic of AVs. While none of the judges had presided over an AV case, they showed an acute ability to take a broad topic related to AVs (e.g., liability) and narrow the scope of inquiry to its ethical considerations and its relationship to existing legal frameworks. Interestingly, the judges often touched upon the same topics that legal and behavioral scholars describe in existing literature on AVs.

The following is an overview of our findings regarding the judges’ attitudes towards AVs and the incorporation of AVs into society.

Adoption, Benefits, and Harm

Overall, the judges were optimistic about the integration of AVs into U.S. society and into societies throughout the world. Although one judge expressed skepticism about whether people would adopt AVs, the rest seemed to take for granted that AVs would be commonplace in the future. In general, the judges did not reveal the kind of hesitation sometimes expressed by members of the public.22

The judges were also optimistic about the effects of this transition. They were keenly aware of the dangers of automobiles and familiar with statistics for vehicle fatalities as well as the causes of those fatalities. Nearly all of the judges believed that AVs would decrease traffic fatalities substantially and would improve mobility for older Americans. During two of our focus groups, some judges shared that they themselves would be prime beneficiaries of this new technology because of their advanced ages. These positive observations offer a different view than much of the existing research, which focuses on humans losing their autonomy by delegating it to their vehicles.23

The judges also believed that AVs would be influential in the shipping and delivery industries, but they were more cautious around this issue. They expressed concern about the loss of jobs inherent in such a transition and suggested that training might be necessary to help displaced workers find alternative sources of employment. However, none of the judges was strictly opposed to the adoption of autonomous shipping or delivery trucks. Rather, the judges believed that these changes were inevitable and that people would have to adapt to technological progress. Beyond a fascination with the technology itself, the judges quickly moved to the broader societal impact of this technology.

Data, Privacy, and Security

Each focus group raised questions about data, privacy, and security. The judges expressed concern about how data generated by AVs would be managed. In particular, they were wary of companies taking too much control over data in cases where that data might illuminate the causes behind an AV crash. One judge suggested that AV manufacturers might want to keep such data from the public, as a way of hiding the manufacturers’ own culpability for the incident or to keep trade secrets from other AV manufacturers. In multiple meetings, judges brought up the idea that federal organizations like the NHTSA or the National Transportation Safety Board (NTSB) might need to assume responsibility for regulating AV manufacturers and perhaps provide guidelines about how data from accidents could be retrieved, such as via a “black box”-type system similar to those currently used in aircraft. This would give the justice system access to some basic information about an AV at the time of an incident, such as the location, speed at the time of the crash, and route information. One judge, who also had ample experience as an accident investigator, pointed out that in the current legal system governing automobile accidents, there is neither a commonly agreed-upon data format nor a consistent policy for sharing data with government entities, automobile manufacturers, or the people involved in the accident. Clear standards and formats for accident-related information, as well as the distribution of that information, could help elucidate civil liability in future AV-related accidents.
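To make concrete what such a standard might contain, the sketch below shows one hypothetical, machine-readable record of the kind the judges described, capturing location, speed at the time of the crash, and route information. All field names and values are invented for illustration; they do not reflect any existing NHTSA, NTSB, or industry format.

    from dataclasses import dataclass, asdict
    from datetime import datetime, timezone
    import json

    @dataclass
    class AVEventRecord:
        # One hypothetical entry in a standardized AV "black box" log.
        vehicle_id: str         # manufacturer-assigned identifier
        timestamp_utc: str      # ISO 8601 timestamp
        latitude: float
        longitude: float
        speed_mph: float
        autonomy_engaged: bool  # was the automated system in control?
        route_id: str           # reference to the active route or itinerary

    record = AVEventRecord(
        vehicle_id="AV-0001",
        timestamp_utc=datetime.now(timezone.utc).isoformat(),
        latitude=39.5296,
        longitude=-119.8138,
        speed_mph=34.5,
        autonomy_engaged=True,
        route_id="route-2021-09-01-a",
    )

    # Serializing to JSON illustrates how courts, regulators, manufacturers,
    # and parties to an accident could exchange the same record in one
    # agreed-upon format.
    print(json.dumps(asdict(record), indent=2))

A shared schema of this kind would not resolve who may access the data, but it would at least make the factual record of a crash legible to all sides.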

The judges also had questions about who owns the data generated by AVs. It was presumed that AV manufacturers would control the majority of the data. The judges did not have a consensus opinion about what manufacturers could do with the data or what rights individual drivers should have regarding data they specifically produce. Similarly, judges were somewhat reticent about expressing strong opinions about who owned this data or who should have access to it. They compared it to other forms of technical data, such as cell phone or personal computer data, which often require a warrant in order to be reviewed. However, the judges generally did not express a position on whether a warrant could be issued to an AV manufacturer for personal data without the AV driver’s consent.

Perhaps the most important issue to the judges was data security. In each of our meetings, at least one judge brought up cybersecurity and the potential for hacking AVs. This concern was framed in terms of national security: could a hacker infiltrate an entire fleet of cars and cause large-scale traffic accidents all over the country at the same time? The judges also voiced concerns about a so-called “lone wolf” terrorist attack via a car programmed to deliver an explosive device or to run into a crowd of people.

Broadly speaking, alongside any previously expressed optimism, the judges identified a number of different arenas in which AVs would pose privacy and security challenges.

Liability

The judges predicted several areas of potential legal issues regarding liability. They specified the need for a “bright-line distinction” in liability cases between situations in which the AV (i.e., the vehicle’s artificial intelligence) is in control, and those in which a human being is in control of the vehicle. One of the judges posited that if the vehicle is in control, then the vehicle is the driver and must be treated as such in terms of attribution of blame. Some of the judges similarly believed that, in the case of fully autonomous vehicles without driver control, liability would solely fall on the manufacturer. However, the judges wondered whether AV users could be liable for an accident if they had not kept the vehicle’s software up to date.

The judges stated that determining liability in cases could become increasingly difficult, especially if many AVs are connected and transmitting information to one another, which could mean that AV systems not directly involved in an accident could nevertheless be implicated in negative outcomes. They also expressed concern that new laws might target AV end-product manufacturers with “deep pockets,” rather than contracting companies or component part manufacturers that might be more culpable but less financially affluent.

Additionally, the judges wondered if we would move to a system where only vehicles, rather than drivers, are insured. Such a system could create perverse incentives, since it could be financially detrimental for a human to take control of an AV, even if the AV was functioning in an obviously irrational manner. However, judges pointed out that this system might benefit the trucking industry by reducing trucking companies’ potential liability, to the extent that AV systems were less likely to cause damage for which trucking companies would be deemed responsible.

Judges were consistently cautious about making any sweeping assertions about liability, preferring instead to examine liability on a case-by-case basis. This affirms the need within AV social science research for specific case vignettes rather than generic questions regarding fault. Matters of liability, especially when and how that liability is attributed to drivers, owners, and manufacturers of a vehicle, are better evaluated in specific contexts.

New Laws and the Judicial System

The judges generally believed that expert witnesses in AV cases would be evaluated by juries in much the same way they are now in other complex cases involving medical, scientific, or technical witnesses. The judges believed that it would remain the attorneys’ responsibility to provide witnesses that could be persuasive and comprehensible in order for juries to fully understand the background of an AV accident and to determine any legal ramifications.

One judge expressed concern that AVs might be programmed in such a way that they would end up demonstrating racial bias. Specifically, the judge wondered whether, if minorities were underrepresented during software development, perhaps because of developers’ implicit biases, such biases would be reflected in how the software interacted with the world. It is plausible that complex deep learning systems, which require extensive training, can demonstrate biases when their training data sets lack sufficient diversity.24 Such a misstep could, for example, make an AV more likely to hit racial minority pedestrians than white pedestrians.
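As a purely illustrative sketch of how such a concern might be audited (the groups, data, and numbers below are invented, not drawn from any real system), one common check is to compare a perception model’s miss rate across demographic groups in a labeled test set; a persistent gap is the kind of imbalance the judge described.

    from collections import defaultdict

    # Invented evaluation records: (demographic group, was the pedestrian detected?)
    evaluation = [
        ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
        ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", True),
    ]

    totals, misses = defaultdict(int), defaultdict(int)
    for group, detected in evaluation:
        totals[group] += 1
        if not detected:
            misses[group] += 1

    # Report the miss rate per group; a large, persistent gap would flag the
    # kind of training-data imbalance described above.
    for group, n in totals.items():
        print(f"{group}: miss rate = {misses[group] / n:.0%}")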

The judges were somewhat divided on the issues of intoxication and AVs. Most judges thought that AVs would reduce incidents of people driving under the influence of alcohol or drugs. But judges disagreed on whether the law should permit an AV passenger to be intoxicated in a fully autonomous vehicle. Some judges believed that, because the car is presumably a superior driver, intoxicated passengers are within their rights to be driven home by a vehicle with no oversight. However, other judges believed that the primary passenger in an AV is the responsible party and therefore must be coherent enough to take the reins from the AV in the event that it malfunctions. Within the available time, focus group discussions never reached consensus on this issue, suggesting that this might be a difficult hurdle for the legal system in the future.

Old Laws and New Technology

The majority of judges stated without equivocation that new laws are needed to deal with issues specific to AVs. If drivers cannot be assumed to be in control of their own vehicles, determining who the responsible party is in an accident will likely become more difficult. A few judges believed that existing liability laws might be able to cover some instances of AV accidents. Most judges agreed that during a transitional phase, while AVs are gradually adopted, current legal frameworks could be adapted to the legal issues posed by AVs to create a standard-setting body of case law. That said, the majority of judges believed that new federal legislation would be needed to prevent state laws from developing in incompatible and inconsistent ways. They believed that the initial period of mass introduction of AVs would pose the greatest challenge for the judicial process because new legal frameworks would only just have begun to evolve. Once AVs are normalized and commonplace, judges said, the judicial system is likely to become adequately equipped to deal with AV cases.

In sum, although judges agreed that the judicial system will adapt, there was also agreement that some level of legal uncertainty will persist until a new body of law develops.

Emerging Problems and Future Opportunities

Key points emerged from our study, many of which suggest future opportunities for education and research:

  • Additional laws must be created to deal with the future incorporation of AVs into society and the inevitable accidents that will occur as a result. There are difficult questions ahead regarding liability in AV accidents, data privacy and security, and legal standards regarding substance use for AV passengers. New laws may preempt some of these issues before they enter the courtroom.
  • Widespread adoption of AV technology will impose massive changes on courts across the country. Currently, the majority of automobile cases before municipal and similar courts (e.g., tribal courts) relate to traffic issues. If AV technology succeeds in making driving safer and rule violations less frequent, caseloads are likely to drop, perhaps to the extent that courts currently inundated with such cases will need less funding and fewer staff.
  • Objective AV data will provide more transparency regarding which party is at fault and could motivate people to settle some disputes without the involvement of the legal system. As a consequence, only the most complicated and difficult cases would require full adjudication by courts — meaning judges and legal personnel will need to be particularly knowledgeable in this area.
  • Judges need more education regarding AV technology and the types of legal issues that will arise in the near future. Now is an excellent time to educate judges — before AVs have been widely adopted. The judges in our focus group were receptive to learning more about AVs and related legal and philosophical issues. Based on our sample, we predict that many judges from around the country would be excited to participate in training and education related to autonomous vehicles if given the opportunity. Law students are more likely than ever to learn about legal issues involving technology and artificial intelligence in law school, but because of evolving technologies their knowledge will likely be outdated by the time they become judges. Judges should have the opportunity to expand their knowledge about AVs as part of judicial education requirements that exist in most jurisdictions.25
  • Although there has been a steady stream of scholarship at the intersection of law and AV technology within the past few years, there is a surprising dearth of research that actually incorporates legal officials. We posed broad questions to our focus group; additional research is needed to focus on specific legal issues that may arise. We encourage more in-depth research with judges in order to illuminate the thought processes of these decision-makers and to give this topic the directed attention it deserves.
  • Researchers are just beginning to examine how people attribute blame and responsibility for autonomous vehicle crashes and artificially intelligent systems more broadly.26
    We encourage this research and its application to groups relevant to the legal domain, specifically judges. Judges will be key players in charting a new path for AVs in society and the law.


Footnotes:

  1. Jeff Ward, 10 Things Judges Should Know About AI, 103 Judicature 12, 17 (2019).
  2. Kyle Graham, Of Frightened Horses and Autonomous Vehicles: Tort Law and its Assimilation of Innovations, 52 Santa Clara L. Rev. 1240, 1242 (2012).
  3. Id. at 1248–52.
  4. Nat’l Highway Traffic Safety Admin., Traffic Safety Facts Research Note – Overview of Motor Vehicle Crashes in 2019, https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/813124, 1 (2020) (last visited June 12, 2021).
  5. Nat’l Highway Traffic Safety Admin., Traffic Safety Facts Crash Stats, Critical Reasons for Crashes Investigated in the National Motor Vehicle Crash Causation Survey, https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/812506, 2 (2018) (last visited June 12, 2021).
  6. Richard Gonzales, Feds Say Self-Driving Uber SUV Did Not Recognize Jaywalking Pedestrian in Fatal Crash, NPR (Nov. 7, 2019), https://www.npr.org/2019/11/07/777438412/feds-say-self-driving-uber-suv-did-not-recognize-jaywalking-pedestrian-in-fatal-.
  7. Nat’l Highway Traffic Safety Admin., Automated Vehicles for Safety, https://www.nhtsa.gov/technology-innovation/automated-vehicles-safety (last visited June 12, 2021).
  8. Jeffrey R. Zohn, Note, When Robots Attack: How Should the Law Handle Self-Driving Cars that Cause Damages, U. Ill. J.L. Tech. & Pol’y 461, 469–72 (2015).
  9. Id.
  10. Sophia H. Duffy & Jamie P. Hopkins, Sit, Stay, Drive: The Future of Autonomous Car Liability, 16 SMU Sci. & Tech. L. Rev. 453, 469–71 (2013).
  11. Kyle Colonna, Note, Autonomous Cars and Tort Liability, 4 J.L. Tech. & Internet 81, 91–109 (2012).
  12. Id.
  13. Carrie Schroll, Splitting the Bill: Creating a National Car Insurance Fund to Pay for Accidents in Autonomous Vehicles, 109 Nw. U. L. Rev. 803, 822–33 (2014).
  14. Nat’l Conf. of State Legislatures, Autonomous Vehicles / Self-Driving Vehicles Enacted Legislation, https://www.ncsl.org/research/transportation/autonomous-vehicles-self-driving-vehicles-enacted-legislation.aspx (last visited June 12, 2021).
  15. Jeremy Laukkonen, Are Self-Driving Cars Legal in Your State?, Lifewire (July 7, 2020), https://www.lifewire.com/are-self-driving-cars-legal-4587765.
  16. See supra note 14.
  17. Nat’l Conf. of State Legislatures, Autonomous Vehicles State Bill Tracking Database, https://www.ncsl.org/research/transportation/autonomous-vehicles-legislative-database.aspx (last visited June 12, 2021).
  18. Jack Boeglin, The Costs of Self-Driving Cars: Reconciling Freedom and Privacy with Tort Liability in Autonomous Vehicle Regulation, 17 Yale J.L. & Tech. 171, 176–80 (2015).
  19. Dorothy J. Glancy, Privacy in Autonomous Vehicles, 52 Santa Clara L. Rev. 1171, 1188, 1198 (2012).
  20. Lisa Collingwood, Privacy Implications and Liability Issues of Autonomous Vehicles, 26 Info. & Commc’ns Tech. L. 32, 36 (2017).
  21. William J. Kohler & Alex Colbert-Taylor, Current Law and Potential Legal Issues Pertaining to Automated, Autonomous and Connected Vehicles, 31 Santa Clara High Tech. L.J. 99, 120–32 (2015).
  22. Kareem Othman, Public Acceptance and Perception of Autonomous Vehicles: A Comprehensive Review, AI & Ethics 1 (2021); Chris Tennant, Sally Stares, & Susan Howard, Public Discomfort at the Prospect of Autonomous Vehicles: Building on Previous Surveys to Measure Attitudes in 11 Countries, 64 Transp. Research Part F: Traffic Psych. & Behav. 98 (2019).
  23. See, e.g., Boeglin, supra note 18; Glancy, supra note 19.
  24. Kimmo Karkkainen & Jungseock Joo, FairFace: Face Attribute Dataset for Balanced Race, Gender, and Age for Bias Measurement and Mitigation, Proceedings of the IEEE/CVF Winter Conf. on Applications of Comput. Vision 1548, 1551, 1555 (2021).
  25. Evan Murphy, Markus Kemmelmeier, & Patrick Grimes, Motivations, Barriers, and Impact of Continuing Judicial Education: A Survey of U.S. Judges, 57 Ct. Rev. 40, 41 (2021).
  26. See, e.g., Christopher J. Copp, Jean J. Cabell, & Markus Kemmelmeier, Plenty of Blame to Go Around: Attributions of Responsibility in a Fatal Autonomous Vehicle Accident, Current Psych., DOI:10.1007/s12144-021-01956-5 (2021); Ryan M. McManus & Abraham M. Rutchick, Autonomous Vehicles and the Attribution of Moral Responsibility, 10 Soc. Psych. & Personality Sci. 345 (2019); Daniel B. Shank, Alyssa DeSanti & Timothy Maninger, When Are Artificial Intelligence Versus Human Agents Faulted for Wrongdoing? Moral Attributions after Individual and Joint Decisions, 22 Info. Commc’n & Soc’y 648 (2019).