Human agency is the ability to direct our own thinking. It’s not just a philosophical construct; it’s the everyday act of deciding what to learn, what to believe, and when to stop and think before accepting an answer. Agency is what keeps us from running on cognitive autopilot.
Artificial intelligence now offers to do much of that work for us. With a single prompt, we can receive elegant summaries and polished solutions that are so smooth and immediate that they can (and often do) lull us into submission. If we aren’t careful, we risk becoming passengers in our own intellectual journey, letting the machine set the course.
But I think there’s another way to see this. AI can also be the ultimate training partner, a sparring partner for the mind that forces us to level up. The essential challenge is to keep agency at the center of the interaction and to make sure we are directing the conversation rather than outsourcing it.
Agency as the New Literacy
Critical thinking has always been the backbone of education, but in the age of AI, it needs a serious upgrade. It's no longer enough to simply "check sources" or avoid misinformation. Agency means going further: deciding which questions matter, shaping the direction of inquiry, and, perhaps most importantly, staying engaged even when the machine's answer seems more than good enough.
Drexel University professor Michael Wagner recently offered a helpful framework in his blog by presenting the “Four Lenses of Critical Engagement.”
- Critical Reading. Looking past the surface of text to understand the “algorithmic curation” behind what we see.
- Critical Listening. Questioning the voices we hear, whether human or synthetic, and noticing how tone and rhetoric influence us.
- Critical Seeing. Recognizing how images and data visualizations can persuade or mislead.
- Critical Making. Creating content ourselves and reflecting on how the tools we use shape our output.
These aren’t academic exercises; they’re survival skills. They are how we keep authorship of our thoughts in an era when AI can generate the brilliant illusion of cognitive theater.
Iterative Intelligence and Learner-Centricity
This challenge sits at the heart of what I’ve called iterative intelligence. It’s the ability to learn, test, refine, and learn again in a dynamic loop. AI excels at iteration, but the learner must stay in control of the loop. And that’s where agency becomes essential.
Education in the AI era should be learner-centric, not machine-centric. The best question isn’t “What can AI do for me?” but “What do I want to think about and how can AI help me think about it better?” When a student uses AI to explore multiple perspectives on a problem, ask “what if?” questions, and challenge the outputs they receive, they are exercising agency. When they simply accept the first answer, they surrender it.
The Risk of Cognitive Abundance
There's also a fascinating paradox here. We live in a time of cognitive abundance, a term often invoked to describe a transformative outcome, and never before has it been so easy to access so many ideas so quickly. But abundance can have a dulling effect. When knowledge is cheap, we can lose the desire to search for it and, in the final analysis, to make it our own.
The key point here is that agency is the antidote. It's what turns abundance into opportunity rather than overwhelm. It's the skill that keeps learning active rather than passive, and thinking genuinely generative rather than derivative.
The Seduction of the Asymptote
AI doesn't just answer our questions; it moves closer and closer to sounding exactly like us. And that's worth reading again. Each iteration brings it nearer to the curve of human thought, so close that the difference becomes nearly imperceptible.
I still remember making amyl acetate in organic chemistry class years ago. It’s a simple synthetic reaction where you mix amyl alcohol and acetic acid, and you get a clear liquid that smells exactly like bananas. If you tasted it, you’d swear it was bananas, but of course, it wasn’t. So close, yet so far away.
AI creates the same freakish effect. It produces language so natural, so perfectly human-like, that we can forget it isn’t human. That’s the seduction of the asymptote—the closer we get to perfect imitation, the more tempting it becomes to stop noticing the gap at all.
A Call to Educators, Parents, and Innovators
For educators, the task is to teach students not just how to prompt an AI system, but how to stay awake while using it: how to pause, question, and take control of the process. For parents, it means guiding children to see AI as a tool for exploration and not just a convenient shortcut to completion. For innovators and technologists, it means designing systems that encourage reflection rather than merely delivering instant gratification. And in today's techno-world, that's easier said than done.
It’s essential to recognize that agency isn’t something AI can give us or take away from us, but it can make us forget to use it. And that, I believe, is the most profound risk of all.
Staying the Author of Our Minds
Human agency is the quiet, perhaps even magical force that keeps us the authors of our own minds. AI doesn’t have to erode that force. But it will, if we fail to exercise it. The age of AI can be an age of unprecedented human growth, but only if we meet the machine head-on with deliberate, active thought.
In the end, agency may be the most important literacy of all. And that’s not because AI is so powerful, but because we are.