
We don’t have to allow AI to do everything for us

This is the sixth installment of Meghan Sullivan’s series about DELTA, Notre Dame’s faith-based framework for a world of powerful AI.

In today’s column, we’re covering the fifth and final letter in the DELTA framework: A, for Agency.

Let’s face it: Human beings have never been superstars at making decisions or taking moral responsibility. There’s a reason why we got kicked out of the Garden of Eden. As flawed beings, we need to rely on our families, our churches and our schools to teach us how to cultivate and use our moral agency correctly — to know when we’ve done something wrong, when to ask for forgiveness, when we need to own our decisions and when it’s okay to depend on someone else to decide for us. That’s what moral development means.

Moral development is hard work. To succeed, we need to be able to lock in and focus. And that’s getting more difficult all the time, because our capacity to pay attention is under siege.

In the last 20 years, the economic model for the internet and social media has been an economy of attention. You do a Google search and you look at an ad at the same time. You check out a video on YouTube or your favorite Pinterest influencer; more ads. Using algorithms that customize our social media and streaming feeds, corporations keep us scrolling in order to monetize our attention. This raises a series of pressing moral questions that will only be compounded as we move to an economy fueled by artificial intelligence that not only captures our attention, but makes decisions on our behalf.

The great promise of this new technology is that certain aspects of human work can be delegated to agentic AI systems and completed more efficiently. And sometimes that’s what we want: I love having AI give me a recipe based on the ingredients I have available in my refrigerator so that I don’t have to take a trip to the grocery store. But there are other times when we should choose the slower, less efficient option because that is how we exercise our conscience.

If you ask a friend to go to the gym in your place, you won’t improve your own strength. The hard choice bears the most fruit — not simply in the end result, but in the value of the labor itself. The philosopher Simone Weil makes this point in her essay “Reflections on the Right Use of School Studies with a View to the Love of God”: the struggle with a particular calculus problem has value beyond finding the solution, because through this strenuous work, you build your capacity for attention.

In this new era of AI help-bots, we need to be very mindful of these types of choices. Generating a recipe for dinner may be a reasonable use of AI, but using AI to summarize the “Nicomachean Ethics” for a class assignment is not. Doing that reading yourself demands more of your time, energy and presence, but the challenge of sustained effort is key to forming virtue and growing as a person.

Despite the narratives that we hear in the media, we don’t have to allow AI to do everything for us. You can choose when to use AI and when not to, and you should deliberate over that choice in light of what kind of person you want to become. There are many things that we absolutely should not pass off to AI — determining whether or not a civilian casualty is morally acceptable in battle, for example, or the loving, nurturing and deeply inefficient work of raising a child. Delineating those boundaries requires us to be reflective about what we ultimately value.

This work is difficult, and the proliferation of AI systems will only intensify the challenge. But the effort is worth it. In order to flourish as humans, we need to cultivate our attention and exercise our agency. Remember the words that Marcus Freeman uses to challenge and inspire his players: “Choose hard.”

Meghan Sullivan

Wilsey Family College Professor of Philosophy and director of the University-wide Ethics Initiative and the Institute for Ethics and the Common Good

Dec. 8, 2025

The views expressed in this column are those of the author and not necessarily those of The Observer.