Can The Humanities Learn To Love AI?

Q&A With Alexa Alice Joubin

By John DiConsiglio


Artificial intelligence (AI) isn’t just on its way to humanities classrooms—it’s already there! From students asking philosophy questions to ChatGPT to professors using AI platforms for sharpening writing and research skills, AI is transforming the humanities world every bit as much as computer science labs.

And despite fears that it may encourage cheating or erode basic skills, some humanities scholars are seizing AI’s classroom potential—like self-proclaimed AI “early adopter” Alexa Alice Joubin, professor of English, theater, international affairs, East Asian languages and cultures, and women’s, gender and sexuality studies.

Joubin has made AI a centerpiece of her scholarship. She’s an affiliate of the new Institute for Trustworthy AI in Law & Society (TRAILS) at GW, a founding co-director of the Digital Humanities Institute and an inaugural GW Public Interest Technology (PIT) Scholar.

In the classroom, Joubin has embraced AI as a technology tool that can be as instructive to the humanities as an encyclopedia—or the written word itself. In her courses, she uses AI platforms to help students learn how to ask quality questions, conduct in-depth research and refine their critical questioning skills. As a PIT scholar, Joubin is pioneering trustworthy AI projects, including creating an open-access AI tutor based on her own teaching model. And she also champions the technology’s potential to create a more inclusive classroom for international students who may struggle with English and students with varying learning needs. “It’s an empowering tool if you deploy it responsibly,” she says.

In a recent conversation, Joubin explained what AI can bring to the humanities landscape—and how humanities can help shape the future of AI.

Q: You describe yourself as an early adopter of AI in the classroom. How did you first become interested?

A: I’m very interested in the relationship between art and technology. Technology relies on art.

When you launch a new technology, you are telling a story, a narrative. There is technicity in art, and artistic imagination brings forth new technologies. And, of course, art needs technology. If you think about it, what is a quill pen? It’s a craft for writing—a technology. Technology is any application of conceptual knowledge for practical goals. As early as ancient Greece, people were dreaming of machines that could do things autonomously. And even in the 20th century, [mathematician] Alan Turing famously gave us the Turing Test on whether there is consciousness in the computer—and consciousness is a humanities question. So this didn’t start with ChatGPT. It’s one famous iteration over a long history.

When generative AI came along in late 2022, I was thrilled. I jumped on it right away. I was disappointed in the early days. But I’ve been steadily teaching with AI and urging my students to look at it realistically and critically. It’s not a devil and it’s not an angel. But AI is in our mix and it’s not going away.

Q: Where are we in the relationship between AI and the humanities?

A: AI really is a humanistic issue, and it has ignited broad interest in questions about free will, mind and body and moral agency. When people talk about ChatGPT, they talk about these questions. That’s why the humanities are front and center in this [debate]. The humanities provide a range of tools for people to think critically about our relationship to technology and about the so-called eternal questions. What makes us human? How do you define consciousness? These classic philosophical questions have gone mainstream thanks to all the debate about ChatGPT. Free will has suddenly become an important topic.

Q: How do you think the humanities world is adapting to AI? It seems that most people are either pro-AI or anti-AI—and the humanities largely fall into the anti-camp. Am I wrong?

A: Unfortunately, there seems to be a lot of fear and uncertainty. Even worse, there’s an indifference—a thinking that this has nothing to do with humanists. But it actually has everything to do with everyone in fields ranging from humanities to social sciences and theory. It is forcing us to pause and rethink some fundamental assumptions.

But technophobia, fear and indifference can lead to a shunning of AI. And that translates into an unhealthy classroom. We know students are using it. When they graduate, they are expected to have literacy in it. And writing, critical thinking and meta-cognition are becoming all the more central because of AI’s challenges. The bar is being raised.

Q: Can you give me an example of what AI technology can bring to the humanities classroom?

A: It can bring a level of self-awareness, because AI is a social simulation machine. It cannot create new knowledge, but it’s a repository of social attitudes. I teach my students to treat it like a shadow image of society. It allows you to think at a meta level about your role in a society and how society reacts to certain things. For example, when I teach “Romeo and Juliet” in my drama class, students invariably have ideas about performing the play in a modern setting. AI can generate visuals for the scenes they describe in their heads. But students often come back to me and say: Why are Romeo and Juliet always white? Why aren’t they Black or Latinx or a queer couple? It forces them to rethink how they phrase their questions and their default assumptions. It’s an extremely fun and eye-opening exercise, but it also helps us examine our unspoken, unconscious racism or sexism.

Q: As a new PIT Scholar, one of your priorities has been to explore issues around trustworthy AI. How do you see humanities contributing to that conversation?

A: How do you build trust? That’s fundamentally a humanistic question. And there are many ways to define it—transparency, ethics, accountability, interpretability. The humanities are particularly good at exploring these critical theories in complex domains that deal with open-endedness. They require agile thinking. You have to be dynamic and always assessing and reassessing the context. Humanities scholars know that there’s no single universal morality. It depends on perspective. And a key humanities contribution is the ability to entertain ambiguity and multiple perspectives at once.


Q&A with Dean Alyssa Ayres


What drew you to the Elliott School and this position?
I’d taught at the Elliott School a couple of years ago, and admired the school’s academic excellence and real-world impact on educating policy leaders. The school’s mission, with its emphases on education, research, and public engagement, speaks to the different parts of my life. When the deanship opened up at Elliott in 2020, I was drawn to the opportunity to be part of such an accomplished school of international affairs, a community of scholars thinking about these issues, and at a time when foreign policy and national security concerns are front and center in our lives.

What are your top priorities for your first year as dean?
My number one priority is to get to know the school—get to know the faculty, staff, and students, and your top ambitions and concerns! I am also keen to think through with all of you how the Elliott School can build on its great strengths across the disciplines and take the lead on emerging foreign policy trends like the expansion of new actors in international affairs, or how best to organize our work on complex issues like climate, global health, and cybersecurity. I really look forward to conversations with the Elliott community on these and so many other issues. 

If you had to pinpoint a childhood experience that sparked your interest in international affairs, what would it be?
This is a very long story. I was originally on the engineering track. But I did a semester abroad program in India during college, and that shaped every single step from there. It’s one reason I continue to advocate for study abroad programs, and most especially, for programs that are a little less-traveled. Please ask me for my slide on study abroad destinations! 

What achievement are you most proud of in your career to date?
I am most proud to have been part of the team at the State Department that staffed the early years of the U.S.-India Strategic Dialogue, a whole-of-government cabinet-level consultation created by then-Secretary of State Hillary Clinton. It continues to exist although it has gone through two format revisions; it provides an anchor on the diplomatic calendar with an important country that is not a formal U.S. ally. 

What are some of the qualities you admire in leaders you have worked with and why?
Transparency and collegiality, because that makes it so much easier to get through (inevitable) challenges together. 

Any advice for Elliott students going on their second virtual semester?
This is hard, and we are collectively living through something unprecedented in our lifetimes. While we all stay buckled down for the months ahead, it’s still important to take breaks, and step away from the screen sometimes. (This advice will be a lot nicer to follow once the season changes to spring.)

And finally, what’s the best advice you’ve ever been given?
Start saving for retirement early. I promise you it is important!