Doubt & Dialogue: What Happens When Faculty Explore AI Together

Close up of Benjamin Franklin statue, awash in amber light.

June 25, 2025 by Erin Bartnett

When Cam Grey of Classics walked into the first session of the CETLI Seminar, Using AI Productively in Your Teaching, he thought to himself, “I'm going to hate this.” 

It was Fall 2024, and generative AI was raising new questions for student learning. Grey was suspicious of the way students engaged with AI tools. First, there were the concerns he had for learning in his discipline: Classics is a notoriously old field, with ample examples of outdated and even problematic scholarship. Most of it is publicly available and has been used to train ChatGPT, so when students use ChatGPT or other LLMs trained on that data to generate their essays or support their learning, the results can be misleading. Beyond his discipline, Grey was concerned more generally with how to protect the process of learning when the act of writing—learning how to think through ideas and then express them in language—is the emphasis of the work students do in class. But Grey also knew AI wasn’t going anywhere, and he felt compelled to “face [his] fear” and better understand its role in his teaching moving forward.

On the first day of the seminar, Grey recalled sitting around the table with the other members of the inaugural AI cohort—faculty from SAS, Design, Law, Nursing, Wharton, and SEAS—when Rachel Hoke and Cathy Turner introduced the first exercise.

They were asked to write a recipe and then prompt ChatGPT to do the same. As a group, they evaluated the results. The experienced chefs in the room had far more to critique—notes on the ingredient list, cook times, oven temperatures—and the group began to see that the tool’s effectiveness was defined by the expertise a user brought to it.

For Grey, this was a refreshing way to think about AI. He was energized by the conversations he had with his peers. “When you're in a conversation with smart people who are thinking about this stuff in thoughtful ways, it stimulates all kinds of ideas.”  

Seminar Conversations

The seminar helped me to think through the fact that this isn't about teaching with or teaching against AI. It's about how we can use this tool alongside all the other tools we have, to do what we're trying to do, which is teach our students the skills that we want them to learn.
Cam Grey
Classics, School of Arts & Sciences

Clarifying the Purpose for AI Use

One of his peers, Joshua Mosley of the Weitzman School of Design, joined the seminar to think through how the adoption of new technology affects how we teach. For Mosley, AI felt like “a bigger social and teaching change” than other tools he had adopted in the past.

He noticed the social change in students’ emails. Where a student used to write a couple sentences to explain why they were missing class or needed an extension, Mosley began to receive long, polished emails that were clearly written by AI. “I was missing the opportunity to get to know them through their writing and felt this was working alongside students as their personas developed,” he says. 

As for its impact on his teaching, he entered the CETLI seminar curious about “whether other Penn faculty would also be concerned about losing basic skills because of the acceptance of AI.” He wondered: “If we accept that AI will erode ambition for head-on challenges and well-sourced answers in favor of quick semi-effective arguments—will that be the new reality?” The seminar became a space to hear other faculty’s perspectives on why that couldn’t work—including Cam Grey’s position on Classics scholarship.

At the same time, Mosley experimented with using AI in his course. For example, he had his students work with AI to see if it could help them develop meaningful story ideas for animations. In the past, he would sit with students as they struggled to define their projects. In the experiment, Mosley and the student would work together to use AI to mine and explore additional references, details, or specific historical moments and locations that could add depth to their stories.

The seminar provided the opportunity to reflect on that learning in real time. “It was nice to have that seminar while I was testing to see which parts of AI were effective and which parts were not effective,” he says. “I have a sharper idea of when to use it and when it’s not going to be the most efficient way to use my time to solve some kind of problem.”

It was nice to have that seminar while I was testing to see which parts of AI were effective and which parts were not effective.
Joshua Mosley
Weitzman School of Design

Working through the Ethics of AI Use 

Grey also began to experiment in the classroom.  

He brought back old-school learning tools and experimented with new ones. For example, he had students write in class, using the tried-and-true Blue Book, but he also experimented with a new assignment that directly incorporated AI. In Decline and Fall of the Roman Empire, which Grey co-taught with Kim Bowes, the two had students generate an essay using AI. Then, students had to critique and edit that same essay. It was an opportunity for students to see that they had already garnered some expertise, and that they could leverage it to enhance their use of AI tools when appropriate. He also started using Perusall, a collaborative reading tool. “That wouldn’t have been a tool that I would have thought of if I hadn’t been really focused on how we get students to go through an intellectual process rather than just generate a product at the end.”

The seminar conversations, which stretched beyond whether or not to use AI in teaching, helped him think through more nuanced ideas about the ethical use of AI tools for student learning. “A lot of our conversation really was about how we encourage our students to think as ethical beings and as learners about their engagement with this tool.” 

Indeed, this emphasis on designing assignments for the learning process rather than the product was one of the values of the seminar for Grey. “The seminar helped me to think through the fact that this isn't about teaching with or teaching against AI. It's about how we can use this tool alongside all the other tools we have, to do what we're trying to do, which is teach our students the skills that we want them to learn.” 

What I loved about [the seminar] was that around the table, there was such a broad diversity of entry points, comfort levels, and facility with the [AI] tools.
Cam Grey
Classics

Preparing for What Comes Next

Was the seminar a conversion experience? Did Grey or Mosley decide to embrace AI in their classrooms?  

“I'm still AI cautious, and I'm still AI skeptical. But shockingly,” Grey says, smiling, “this is not a binary on/off, yes/no, good/bad kind of undertaking.” 

“I don’t even think I’m now thinking about it as an ‘Oh no, this is something that we have to live with,’” Grey says. “I think I’m more acclimated to the idea that there are always new tools that are becoming available for us to help our students to learn. It’s on them to be ethical in their ways of learning, and it’s on us to help them to be ethical in their ways of learning. But being more efficient, being more effective isn’t necessarily not being ethical.”

One of the ways he plans to continue this work in the next academic year is to have a clear and purposeful AI policy in his course syllabi and to communicate regularly with his students about his stance on AI.

His peers in the seminar had a wide range of perspectives on AI, and this helped Grey clarify his own thinking: “What I loved about [the seminar] was that around the table, there was such a broad diversity of entry points, comfort levels, and facility with the tool.” While Cathy and Rachel offered entry points for conversation, the seminar group also generated their own. And what united and energized the conversation was a shared investment in student learning. 

For Mosley, the experiments with AI in the classroom also helped him clarify his learning goals. “I think the AI work I did this year is probably pushing me into looking for ways to have more of a human-to-human, and human-to-material kind of connection.” He’s planning to return to more hand drawing with paper and pencils.  

Working through when AI is helpful and when it isn’t has also become an opportunity for Mosley to connect with students. Sometimes, when a student starts working with AI, it leads them in the wrong direction. But it also gives them an opportunity to meet with Mosley and discuss why their question prompted a different recommendation.

The seminar helped model the conversations and connections he was interested in building in his classroom. It wasn’t only a seminar about AI, “it was also a great experience observing how to teach,” Mosley says. “It's rare to be able to meet people like this outside of your school and share all of the doubts, concerns, and the excitement about what you’ve discovered is possible.”