The Office of the Provost, Notre Dame Learning and the Lab for AI in Teaching and Learning (LAITL) hosted a faculty and student summit on artificial intelligence at the IDEA Center on Friday afternoon.
According to the Technology and Digital Studies Program, the purpose of the summit was to “build mutual trust between faculty and students around the evolving role of AI in education ... clarify assumptions about the purpose and appropriate use of AI in teaching, learning, and assessment ... understand what’s working and what’s not from both faculty and student perspectives, and explore the ‘why’ behind those perceptions ... [and] reflect on the meaning of a Notre Dame degree in an AI-driven world.”
G. Alex Ambrose, the director of LAITL, said, “Eighty-eight people attended. So, I think there’s something to say about faculty and students wanting to be together.”
The summit is the first event of its kind at Notre Dame, though town hall meetings with faculty members and undergraduate research panels with students and faculty have been held in the past, according to Ambrose.
“Events like today, bringing students and conversations across colleges are important,” Ambrose said.
The summit began with introductory remarks from program coordinator Megan Rogers ’22 and Ron Metoyer, vice president and associate provost for teaching and learning.
Rogers explained that the summit “addresses the use of AI in education but also long-standing problems like how to incorporate the voices of students and faculty directly into institutional decision making.”
The summit was structured around three rounds of personal reflection and group discussion and two rounds of ideation on possible solutions to concerns about AI. Faculty and students were grouped by major or department, with an average of two faculty and three students per table.
A digital workbook guided participants through answering questions and encouraged deeper analysis to identify their personal perspectives on AI in education.
Some examples of these guiding questions were, “What do you truly want to gain from your experience at Notre Dame?” “When was the last time you walked away from a course feeling like you truly learned?” and “What is your biggest fear when it comes to AI and education?”
Afterward, each table held a group discussion in which participants shared a range of perspectives on the role and purpose of AI in education.
“Many people use AI as a critical topic to blur the fact that we have similar needs we want to fulfill, and I think that seeing through that fog is what’s really important,” sophomore computer science student Kristofer Ulanday said.
Ulanday also said that the computer science and engineering department “had positive thoughts on the usefulness of AI for already known knowledge and repetitive tasks, but had overwhelmingly negative thoughts on how AI removes the learning process.”
Marianne Cusato, director of Notre Dame’s Housing and Community Regeneration Initiative, shared concerns about the usage of AI in social and academic contexts.
“I think it’s going to make us worse at critical thinking, relationships and interacting with other people,” she said. “So the risk is that we lose our ability to think and we lose our ability to understand reality and relate to each other.”
Freshman Sophia Edels, double majoring in marketing and finance, also shared her thoughts.
“I’ve seen a lot of ways that AI is a resource, but also has negative implications,” Edels said. “In my Writing in the Age of AI class, we’re exploring the ways that AI can enhance writing and multimedia.”
“The changes from AI are inevitable, but the education field and learning process are always changing as well,” Edels added.
Cusato discussed changes to her curriculum and grading procedures for her classes.
“I’m changing how I teach seminar courses by not assigning papers, because I’ve conducted informal polling of students, revealing that many of them would have a paper written by AI,” she said. “So we have to be creative in how we’re engaging and make sure that from the faculty point of view, we’re structuring our courses in a way that is engaging, meaningful and connecting with students.”
Ulanday discussed the importance of personal accountability when using AI.
“Think about AI’s use cases and how it can benefit you, but, most prominently, take personal responsibility for your learning and your relationship with the world,” he said. “AI is a tool that should expedite monotony, not learning. As a student, it’s both exciting and terrifying to see the effects of both outcomes daily.”
Ambrose advised students to double-check AI policies with professors to ensure appropriate use.
“Make sure to think very personally and ethically with professors to figure out where AI should and should not belong, if, when, and how it should complement learning,” he suggested.
At the close of the summit, Rogers said she hopes that the ideas shared by faculty and students can be passed on to various departments and decision makers.
“I think Notre Dame is a very well-positioned school to tackle this challenge because of our focus on ethics and about doing the right thing, students doing the right thing and faculty doing right by their students. I have so much hope that Notre Dame will be able to figure this out and then we can share it with other schools,” Rogers said.