
AI Conversations on Campus
W&L’s forward-thinking mindset surrounding the emergence of AI leads to engaged classwork, research and discussions.
On Sept. 19, Washington and Lee University hosted its inaugural PLAI Summit, a daylong gathering designed to spark dialogue across disciplines about one of the most pressing forces reshaping our world: artificial intelligence.
The event, held in tandem with the university’s Young Alumni Weekend, took place in the Houston H. Harte Center for Teaching and Learning, home of W&L’s new PLAI Lab — short for “Prompting, Learning and Artificial Intelligence” — a hands-on space in Leyburn Library where students, faculty and staff can experiment with AI tools under the guidance of campus experts. Throughout the day, the PLAI Lab was open for drop-in experimentation, letting participants test image generators, explore large language models and ask hard questions in real time.
“People are engaging with AI at many different levels,” said Sybil Prince Nelson ’01, P’28, assistant professor of mathematics at W&L, who served as emcee for the day’s events and who also serves as the campus AI Fellow. “I hope the summit helps those who haven’t yet explored AI to understand its capabilities. For those already using it, I hope it offers a clearer view of where the technology is headed, especially in higher education.”
The summit featured a panel of young alumni now helping shape AI-driven industries: Ethan Fischer ’20, vice president of strategy at K Health; Logan Brand ’20, CEO of Selium Solutions and former Microsoft product manager; and Kaitlyn Brock ’20, financial accounting advisory services senior consultant at EY.
Fischer foresees AI transforming medicine. His company built predictive models trained on real clinical data to automate parts of the consultation process.
“The goal is to free doctors to have more direct, informed conversations with patients,” he said.
For Fischer, W&L’s liberal arts foundation was the best preparation for working in the rapidly evolving AI landscape.
“It taught you how to ask questions and be curious,” he said. “You know how to engage, synthesize and communicate.”
Brand, who has closely followed AI’s development in his career, echoed that theme.
“Adaptability is one of the highest-valued traits for people to be successful going forward, maybe even more so now because of how quickly things are changing,” he said.
Both alumni stressed that the best outcomes come when AI augments human skill, not when it is used to replace it.
“There will be a bifurcation of outcomes,” Brand predicted. “People who use these tools to develop themselves and improve will thrive. People who use them as just a means to an end will not.”
JT Torres, director of the Harte Center, says the goal of these conversations is to give attendees the confidence to approach the AI wave with curiosity.
“Our overarching goal is to empower attendees to thoughtfully incorporate AI into their work while preserving the creativity, critical judgment and ethical reasoning integral to a liberal arts learning environment,” Torres says.
Among the most anticipated sessions was the faculty panel moderated by Nelson featuring Josh Fairfield, William Donald Bain Family Professor of Law and Director of Artificial Intelligence Legal Innovation Strategy at W&L Law, and Jeff Schatten, associate professor of business administration at W&L. The conversation ranged from the transformative potential of artificial intelligence to the ethical and pedagogical challenges facing universities today.
Schatten, author of a new book titled “AI Will Take Your Job (and It’s for the Best),” advocated for a proactive approach that still engages students’ brains. To illustrate his framework, Schatten described what he calls “the button.”
“There’s this magic button that can do a lot of work for us,” Schatten said. “As a university, we need to work both with and without the button. Without it, students and faculty experience the struggle that is necessary for cognitive development. With it, we prepare them for a world in which AI is running our economy. I tell my students directly: ‘There are times when using AI is a disservice to you, because you’re short-circuiting your own development. But there are other times when not using it would leave you behind.’ The key is deliberate, transparent decision-making about when and why we use the button.”
Fairfield raised questions of governance and accountability.
“We are watching the emergence of AI as not just a tool but as an agent,” Fairfield said. “These systems are beginning to act in the world on our behalf — negotiating contracts, managing transactions, even making autonomous decisions.”
Fairfield emphasized that the stakes go beyond efficiency or convenience.
“AI is fundamentally about power — who wields it, who controls it and who is excluded from it,” he said. “We must be vigilant, because, without careful regulation, the benefits will accrue to a small handful of corporations while the costs are borne by everyone else.”
At several points, the panelists circled back to what makes Washington and Lee, and the liberal arts model more broadly, distinctively positioned to grapple with AI.
“We are in the business of producing new ideas,” Schatten said. “Increasingly, those ideas will be co-intelligence — collaborations between humans and AI. That doesn’t replace the human work. It makes our role as thinkers, as ethicists, as creators even more important.”
Fairfield agreed, stressing the need for moral imagination.
“AI forces us to ask fundamental philosophical questions about consciousness, responsibility and justice,” he said. “These are not computer science questions alone. They are questions for philosophers, historians, artists and lawyers. They are liberal arts questions.”
The summit arrived at a crucial moment, when conversations about AI are everywhere, particularly in higher education, and the day’s events highlighted the thoughtful dialogue on AI and pedagogy that has been taking place across campus since the technology rose to prominence.
Li Kang, assistant professor of philosophy at W&L, said that a sense of both awe and urgency led her to design “ChatGPT and Philosophy,” a course she piloted in Winter 2022 and taught again this past Spring Term. The class structure reflects Kang’s conviction that students need both timeless philosophical frameworks and the latest real-world examples to think about the implications of AI’s rapid development; readings pair classic texts from philosophers such as John Searle and Daniel Dennett with up-to-the-minute reporting on AI’s social effects.
“I want students to compare human and artificial intelligence — creativity, morality, consciousness,” Kang explains. “If you want to ask whether AI can have consciousness, you do need to understand what consciousness really is, even for us. If you want to understand whether AI can be moral, you need a good understanding of what morality is for the human race. All these questions mirror back to who we are and how we understand the world.” That mirroring, Kang adds, is part of why the study of AI is well-suited to a liberal arts institution.
For Niloofar Gholamrezai, visiting assistant professor of art at W&L, AI is best taught through making. In a visual design course she taught last year, students began with fully manual processes before layering in generative AI tools. The goal, she says, was “to gain critical AI literacy — to learn how to work with AI and compare it with the manual process.”
Students were asked to reflect on what Gholamrezai calls “emergent behavior,” which refers to the unpredictability of AI outputs. She emphasizes that manual skills remain irreplaceable.
“Nothing can replace my handwriting, right? That originality is important,” Gholamrezai says, adding that she tells students AI can be an added skill without sacrificing individuality. Her recent Spring Term class on digital storytelling took this idea one step further, asking students to use AI to generate visuals and analyze social media data while producing autobiographical videos and creating professional storytelling projects. Reflection remains central to her teaching.
“Whenever I teach AI, my goal is less about the artistic quality and more about the technological skill and the reflection,” she says.
Faculty research has also turned toward the philosophical and practical puzzles of AI. Dan Johnson, chair of the Data Science Program, conducts research and teaches courses on natural language processing and creativity. Johnson’s Computational Cognition and Creativity Lab studies whether AI expands or constricts the range of ideas in a given field.
Johnson, professor of cognitive and behavioral science, has also been at the forefront of W&L’s efforts to help students, faculty and staff navigate the evolving artificial intelligence landscape through his work with the PLAI Lab. He says he holds ongoing conversations with students about building an atmosphere of shared learning in his classroom, making artificial intelligence part of the learning process as he and his students navigate new terrain together.
“We’re trying to prioritize having a curious mindset,” Johnson explains. “Taking an experimental approach but also being very critical.”
Johnson’s current AI research grows out of years of studying human creativity and expression, and how technology shapes both. One of the pressing problems he and his students are investigating is how AI may paradoxically constrict creativity.
“Right now, AI is considered to be this innovation accelerator,” he explains. “But what’s been showing up in the research is that, in fact, when someone sits down with ChatGPT or some other chatbot, sometimes their ideas do get a little bit more creative, but usually not dramatically so, and what happens as a consequence is that the whole space of ideas shrinks. Everyone’s anchoring their decisions on the same ideas coming from ChatGPT.”
The implications, Johnson says, are stark.
“It helps an individual, but it eventually harms the collective space,” Johnson says. “So what you’re looking at is the opposite effect — a race toward idea homogeneity. Eventually, we all use the same models for the same task for the same problems.”
His lab is probing whether AI can be guided differently. Students have been hands-on with code and interfaces, experimenting with prompt engineering and sentiment modeling.
“Can we get AI to expand that space more than constrict it?” Johnson asks.
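Johnson’s point about a shrinking idea space can be made concrete with a small, hypothetical example. The sketch below is illustrative only and is not drawn from the lab’s actual code or data; the functions and the toy idea lists are invented for this article. It quantifies how similar a group’s brainstormed ideas are to one another: the higher the average pairwise word overlap, the more homogeneous the idea space.

    # Hypothetical illustration (not the PLAI Lab's code): measure "idea
    # homogeneity" as the average lexical overlap among a group's ideas.
    from itertools import combinations

    def tokens(idea: str) -> set[str]:
        """Lowercase bag-of-words representation of an idea."""
        return set(idea.lower().split())

    def jaccard(a: set[str], b: set[str]) -> float:
        """Jaccard similarity between two token sets (1.0 = identical wording)."""
        return len(a & b) / len(a | b) if (a | b) else 0.0

    def homogeneity(ideas: list[str]) -> float:
        """Mean pairwise similarity across all ideas in a group."""
        pairs = list(combinations(ideas, 2))
        return sum(jaccard(tokens(x), tokens(y)) for x, y in pairs) / len(pairs)

    # Toy data standing in for two brainstorming conditions.
    independent_ideas = [
        "a campus tool-lending library for art supplies",
        "peer-led late-night study walks",
        "a seed exchange run by the dining hall",
    ]
    anchored_on_same_prompt = [
        "an AI-powered study planner app for students",
        "an AI-powered campus event planner app",
        "an AI-powered planner app for student clubs",
    ]

    print(f"independent ideas:     {homogeneity(independent_ideas):.2f}")
    print(f"anchored on one model: {homogeneity(anchored_on_same_prompt):.2f}")

On this toy data, the independently generated ideas score far lower than the set anchored on a single model’s suggestion, which is the anchoring pattern Johnson describes.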
As head of the Digital Culture and Information (DCI) Program, Mackenzie Brooks, associate professor and digital humanities librarian, has sought to address a gap in students’ AI literacy: many students were assigned to use AI in classes, but few understood how the technology actually worked.
“They knew when and why they might use it, but not what was happening behind the scenes,” she says.
Brooks frames AI within the context of information literacy, highlighting pitfalls such as fabricated citations, an ongoing issue with AI tools. Her solution is to treat AI tools as texts to be read critically, analyzing what values or priorities are woven into their design. The interdisciplinary nature of DCI courses brings business majors, humanists and scientists into conversations about real-time developments in the digital world. Brooks has observed how differently students assess AI’s impact across disciplines during class conversations.
“The business majors said, ‘It doesn’t really matter if we use this for our ideas, because what matters is human relationships.’ Humanities students felt the opposite — the writing is the work,” she says. That, she argues, is exactly the point of a liberal arts approach: to expose students to these differing norms so they can transfer skills across contexts.
What emerges from these varied perspectives is a portrait of a university leaning into complexity rather than shying away from it, and the Harte Center plans to continue the conversation beyond the summit. On Nov. 10, W&L librarians will partner with the Harte Center to offer a workshop on using AI to gamify curriculum content, blending current scholarship on pedagogical gamification with practical strategies for engineering AI prompts.
Torres and Gholamrezai are collaborating on a new class in which students will examine AI-generated art and music. The course aims to teach students about AI’s ethical impact on artists while also demonstrating ways AI can serve human ideation. In all of these contexts, AI is both a method of generation and a subject of study. To support the course’s objectives, the PLAI Lab created SiM, an AI musician who is self-conscious about being AI. SiM uses music to explore and reflect on the impact of their existence on everything from the environment to misinformation.
“During recent workshops or PLAI Lab sessions, students and faculty have composed songs with SiM following our 80/20 rule: 80% of the content must be human-generated while 20% can be AI,” Torres says. “In this way, SiM symbolizes the ways faculty make critical decisions about when and how to incorporate AI. Based on the learning goals, AI might be helpful or harmful to student development. What is far more important than blanket policies are intentional and transparent decisions about its use as well as the understanding that these decisions will change depending on the context.”
Torres says W&L’s overall approach to AI on campus is grounded in its commitment to the liberal arts.
“SiM also serves as a mascot, so to speak, for W&L’s forward-thinking mindset that is paradoxically made possible by its commitment to the traditional values of liberal arts,” Torres says. “By maintaining critical curiosity and engaging skepticism, W&L has been able to emerge as a leader for other liberal arts universities in terms of defining what it means to be AI competent.”
Go Further
Learn how W&L’s Data Science and Digital Culture and Information minors are preparing students for their future careers.
Read more about Josh Fairfield’s recent appointment as the inaugural director of artificial intelligence legal innovation strategy.
Explore resources available to W&L students, faculty and staff in the PLAI Lab in the Harte Center.
Read more about associate professor of business administration Jeff Schatten’s new book, “AI Will Take Your Job (and It’s for the Best): Embracing the New Social Contract for the Age of AI.”
Josh Fairfield, William Donald Bain Family Professor of Law and Director of Artificial Intelligence Legal Innovation Strategy at W&L Law, and Jeff Schatten, associate professor of business administration at W&L, are introduced at the PLAI Summit by Sybil Prince Nelson, assistant professor of mathematics.
Students in Professor Dan Johnson’s lab discuss their findings.