LLMs Eliminate the Learning Space and Impede Critical Thinking for Young Writers
Critical thinking is hard work, and while AI is making life easier for industry and students alike, it is taking the grind out of the process.

Recently, I asked a friend of mine for some business advice. She returned what I immediately recognized as canned advice from ChatGPT. When I called her on it, she said it was objective. I beg to differ.
LLMs are generating content at an unprecedented rate, and the result is "shallow and biased perspectives" that lack deep engagement and deeper reflection. LLMs, while enhancing learning in some capacities, are quite possibly robbing students, and adults, of cognitive development, critical thinking, and intellectual independence, according to a recent MIT study, "Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task."
At Thanksgiving dinner, I chatted with my nephew, a junior at West Virginia University, about how his peers are using AI as college students. He shared that some use it as a tool for studying, planning, and the like, but he knows some who use it "for everything they do." While I ponder this question daily as an academic advisor, his words resonated with me this weekend. How is AI changing the landscape of learning for those still developing as learners?
The MIT study shows that the functions of learning matter, and getting into the weeds with writing is where the learning happens.
When ChatGPT does the thinking for learners, it takes them out of the unknown and into the "thinking" of the LLM, where few students, if any, deviate from the line of reasoning AI created. Without this second-guessing, users are left with a biased perspective that shapes their experience, creating an echo chamber.
The MIT study shares, "Only a few participants in the interviews mentioned that they did not follow the 'thinking' aspect of the LLMs and pursued their line of ideation and thinking." Most accept the LLM's output at face value.
Finding answers via AI is undoubtedly easier, with less "friction," but it comes at a cost: the answers are curated to fit user expectations. A Microsoft and Carnegie Mellon research study found similar results: "the more you trust AI's abilities, the less likely you are to think critically about its outputs," journalist Charles Towers-Clark reminded readers.
So why does this matter? AI is growing at an exponential rate, and while this is helping solve many issues in industry, what does that mean for those learning how to learn?
Industry professionals, while engaging with AI, have the wherewithal to question and challenge AI-driven outputs. I experienced this firsthand at an AI conference at a university, where participants asked industry leaders to rethink their mission statement, which said users "should not be skeptical or scared of Artificial Intelligence." These leaders replied that all strong thinkers are, and should be, skeptical. Being skeptical is not a negative, they reasoned. It is a necessity.
I recently put my skepticism to the test with a business plan I am creating. After AI designed the content for slides for an upcoming luncheon, I used that content in a second generator on Canva to build out a presentation. The output, while impressive, was less than optimal and not something I could use in the field. As someone who has designed numerous presentations, I could discern its subpar quality. However, depending on where you are in the process, the use of and dependence on AI can mean interrupted and disrupted learning. It can invite skepticism and questioning, or it can mean blind acceptance.
Simply put, students lean into AI more than they lean into their own abilities.
Critical thinking is hard work, and while AI is making life easier for industry and students alike, it is taking the grind out of the process. The Carnegie Mellon and Microsoft study confirms, "For most cognitive activities (knowledge, comprehension, application, analysis, synthesis and evaluation), knowledge workers reported that generative AI had reduced effort."
With reduced effort comes reduced content, clarity, and learning for our youngest learners who are discovering what it means to learn.
In a viral video on social media, Duke alum Dr. Becky Kennedy describes "the learning space." It is the area between knowing and not knowing, often identified by the feeling of frustration. It is in the frustration and the figuring, Kennedy reinforces, that learning takes place.
I have seen this space firsthand in more than twenty years of teaching writing. First, through formulaic writing, we show students how to build an argument; then we show them how to break the formula and design their own pace and structure. It is in this frustration that they grow not only in skill but in confidence.
Writing is critical thinking on paper. Students question, argue, and defend their thinking with personal anecdote and factual evidence. There is deep engagement, and with that deep engagement, learning thrives.
One can argue that LLMs race through the learning space and cut out frustration. What seems like a win can actually be the problem. Students are relieved when a beautifully crafted essay appears on their tablet in 30 seconds flat, but as a result, critical thinking and intellectual independence evaporate. What is left is a shallow, biased perspective that shortcuts thinking and robs a student of the chance to create, think, and grow.
While AI is at the forefront of much discussion, as parents, educators, and learners, we must continue to ask ourselves how and why we are using it. Are we robbing ourselves of the learning space? Are we circumventing critical thinking? Are we eliminating an opportunity to struggle, grow, and ultimately learn?