
Faculty Spotlight: Josh Fairfield

Josh Fairfield is the William Donald Bain Family Professor of Law and the Director of Artificial Intelligence Legal Innovation Strategy.

Professor Joshua Fairfield

Questions about technology’s potential to replace lawyers and judges are not new. In 1977, an article in the Georgia Law Review asked, “Can/Should Computers Replace Judges?” Similar questions are being posed more frequently amid the constant acceleration of artificial intelligence.

Professor Josh Fairfield wonders exactly what people mean when they say they can replace judges, lawyers, and the law with computers. “They can’t,” he said, “and they don’t know they can’t. It’s my job to talk about why that is, and to teach lawyers to have the confidence of their profession. As lawyers, we are specialists in social technology and in rules hammered out in conversations between humans.”

Those conversations take place in natural language, not computing language. “There are certain things you can say in natural language that you can’t code, and there are certain things that you can code that you could never say.”

The problem, said Fairfield, is that technologists often think they can do both because they don’t understand the limits of their field. “It’s true that computers can perform many of our tasks, especially now that AI has really hit,” Fairfield said. “They can write a love letter, for example, but they can’t mean it.”

Internationally recognized for his scholarship in law and technology, Fairfield is the William Donald Bain Family Professor of Law at Washington and Lee and directs the law school’s strategy for artificial intelligence. He has been researching and writing about topics in technology and the law for years and is the author of two books, including “Runaway Technology: Can Law Keep Up?” (2021, Cambridge University Press).

“There is this old idea that law is too slow to handle rapid change in technology,” said Fairfield. “What I do is help students and lawyers, businesses and government agencies, handle the bleeding edge of technological change.”

Fairfield has personal experience with that bleeding edge. His mother was a lawyer, and his father was a computer scientist and a founder of Rosetta Stone, one of the earliest language learning software programs. Originally called Fairfield Language Technologies, the family business changed its name to Rosetta Stone to match its major product. Fairfield was a history major at Swarthmore College when the company was getting started. He worked for Rosetta Stone throughout college, then spent two years as the company’s full-time director of research and development after graduation.

When he entered law school at the University of Chicago, he was initially focused on intellectual property. That evolved into an interest in digital property. After a clerkship with the U.S. Court of Appeals for the Sixth Circuit in Louisville and a stint as an associate at Jones Day in Columbus, he started teaching and writing on digital property. He spent a year in a visiting position at Columbia Law School and then two years at Indiana University School of Law before joining W&L in 2007.

Just as technology has expanded in myriad directions, Fairfield’s scholarship has ranged from big data privacy to the safety of children in virtual worlds, from cryptocurrency to the right of individuals to disconnect from their workplaces, to name only a few. He has consulted with the White House Office of Technology Policy, the Homeland Security Privacy Office, the CIA, the Department of Defense and other organizations on virtual worlds, privacy, online economics, and cryptocurrencies. On a research sabbatical in New Zealand last year, he coauthored an op-ed about a new law giving Australian workers “the right to disconnect,” that is, the ability to refuse contact from their employers outside working hours, and argued that New Zealand’s government should follow suit or risk falling behind in responding to rapidly changing technologies.

According to Fairfield, there is a thread that knits together these seemingly disparate activities: all are technologies that have gotten away from law.

Take the issue of privacy, for instance. Fairfield notes that privacy laws have changed dramatically over the past 30 years because everything we say and do is recorded by devices around us — cars, tablets, smartphones, video game consoles — and the data is used primarily to sell us things.

And AI is what Fairfield calls “the grown-up version” of privacy issues.

“AI and privacy may seem to be very different from each other,” he said. “In fact, they are different ends of the same pipeline: the gathering point, where human data is collected, then used to train machine-learning algorithms, and then the output point, where the machine-learning algorithm is doing some task that traditionally was performed by humans.

“What I focus on is the impact technology has on humans, how human rules respond to technological shift, and how human rules can actually get ahead of technological change,” he said. “Suppose I murder somebody with a lightsaber. It’s still murder. Law can handle technological change just fine as long as we see the real moves people are making and why.”

As for the future of lawyers and judges, Fairfield said members of the legal profession should see AI as a set of impressive capabilities. But those capabilities will always be at the mercy of what humans choose to train the systems to do.

“We have to look at ourselves, not at the technology,” he said. “The problem is not that we should be afraid of AI; the problem is that we should be afraid of not addressing what is wrong with us that causes AI to be like this. Remember, it just learns from us. There are limits to what AI can do.”

“For example, why is it that even though AI is perfectly capable of writing a judicial opinion, no one wants an AI to be their judge? What is that intuition that we need humans to pass judgment on humans?”

Fairfield said his scholarship helps train students on the capabilities and limits of artificial intelligence, which will make them better lawyers. At the same time, he said, his teaching improves his scholarship by grounding him in the practical questions and concerns of digitally native students.

“Many of my best ideas have come from basic questions from students that reveal they are using a technology differently than even the designers expected,” he said. “These organic and emergent technology uses often become flashpoints for the development of new technology law.”

In October 2023, Fairfield was appointed as the law school’s inaugural director of artificial intelligence legal innovation strategy. He is also involved with the AI hub in the Harte Center for Teaching and Learning on the undergraduate side. As director, he focuses on training students to serve companies and firms that need to understand both the capabilities of AI and its limits; with the AI hub, he is helping develop principles for AI use on campus.

With so much written about students misusing AI to cheat on papers and tests, Fairfield said the baseline rule is that AI should facilitate what a student is doing, not do it for them.

“It’s not just because it’s unethical,” he said. “It’s also because it’s important for us as humans to be producing law that is reflective of human concerns.”

Fairfield believes W&L is better positioned than many institutions to deal with the threat AI poses to academic integrity, thanks to the Honor System and the fact that the community has already had conversations about what it means to be ethical.

“Although Silicon Valley may think there is a technological solution for everything, the answer to this dilemma is us — that is, the rules we develop to live together,” he said. “The best way to fight the invasion of AI is a strong community, which is what we have here at W&L.”
