There’s a famous moment in “Dead Poets Society” in which Robin Williams’s character tells his students, “Medicine, law, business, engineering — these are noble pursuits and necessary to sustain life. But poetry, beauty, romance, love — these are what we stay alive for.” Some might think it’s a cringy line, but I think he was right. Writing is not just a tool for earning grades or passing requirements; it is one of the few ways we discover what we believe, what we value and who we are. When we let algorithms generate our thoughts, we are not simply breaking a rule — we are depriving ourselves of one of the few opportunities college gives us to hear our own mind and find what it is that we stay alive for.
Walk through Hesburgh Library at 11 p.m. during midterms, and you’ll hear the same whispered question at almost every table: “Do you think Turnitin will catch it?” Not “Do I understand the reading?” or “Did this assignment teach me anything?” but rather: “Will the software catch me if I used the software?”
This is, sadly, the state of academic life in the age of AI: a state in which — even while paying staggering tuition for what is supposed to be a true higher education — we are more afraid of being caught using AI than of losing the ability to think without it.
As a Writing Center tutor, I see this every week: a student will sit across from me, open a beautifully structured paragraph on their laptop and ask for help improving it. And sometimes I can feel that there is no human voice in the words. The sentences are immaculate and sophisticated, but hollow; polished, but dead: work that simulates thought with no thinker behind it.
In those moments, I feel strangely helpless. I can help a writer clarify an argument or sharpen a thesis; I can guide someone through a paragraph that’s messy, chaotic or half-baked. But I cannot “improve” a paragraph that has no mind behind it — I cannot mentor a ghost. What it comes down to is editing an algorithm’s performance, not a student’s work. And what strikes me is not that the student tried to cheat, but that they seem more anxious about the possibility of detection than about the possibility of never learning to write in their own voice.
There’s an internalized equation in which getting caught equals disaster and not learning is irrelevant. This fear, I believe, doesn’t come from laziness but from the contradictory world we’ve created: one in which a “patchwork” of AI policies generates inconsistent expectations and fosters a culture of surveillance that encourages students to think more like compliance officers than free learners.
Fear has a way of flattening the imagination, and when the fear of punishment overshadows the desire to understand, something crucial is lost: a thinker. Ultimately, this culture of suspicion may produce rule-followers, but it cannot produce thinkers.
If Notre Dame’s mission is truly to form minds and hearts, then the greater threat is not AI. The greater threat is treating students as potential offenders rather than moral agents, and cultivating a generation of students who never get the chance to hear their own voice because they are too afraid to use it.
I recognize that many students don’t feel it’s worth their time to engage deeply with a writing-intensive or arts course when what they truly care about is elsewhere. But what will we stay alive for if poetry, beauty, romance and love become just more content generated by AI?