We all know by now that students’ learning suffered significantly during the pandemic. Some kids struggled to pay attention, and some dropped out entirely, never to return. National test scores show that most students haven’t recovered even five years later.
Still, I’d had it in the back of my mind that eventually those kids would recover, or, worst case, that even if they never really did, at least the ones coming up behind them would return to form. Eventually things would get back on track.
Now I’m having second thoughts after reading a couple of articles about the impact of AI on student learning. If social media is a major distraction and waste of time for a lot of students, AI represents something even worse: a short-circuiting of the need to learn important skills like reading, writing, and math. Kids no longer have to do any of these things, because a free chatbot can do all the work for them in no time at all. Here’s Rebecca Winthrop of the Brookings Institution, in a recent interview, on what she’s seen lately in schools.
I’ve talked to kids all over the country. I’ve seen lots of incidents or cases of highly motivated, highly engaged kids who are using A.I. really well. They’ll write the paper themselves. They’ll go in and use A.I. for research and help them copy-edit. They’re doing the thinking, they’ve lined up the evidence to create a thesis, and they’ve presented it in logical order on their own…
But a lot of kids are using it to do exactly like you said: shortcut the assignments. An example, one high school kid I talked to said: For my essay, I break the prompt into three parts, run it through three different generative A.I. models, put it together, run it through three antiplagiarism checkers, and then I turn it in.
Another kid said: I run it through ChatGPT, then I run it through an A.I. humanizer, which goes in and puts typos in…
And that’s in line with an article published last month recounting what an NYU professor has seen. These are older kids in college, obviously, but the pattern is the same.
Earlier this semester, an NYU professor told me how he had AI-proofed his assignments, only to have the students complain that the work was too hard. When he told them those were standard assignments, just worded so current AI would fail to answer them, they said he was interfering with their “learning styles.” A student asked for an extension, on the grounds that ChatGPT was down the day the assignment was due. Another said, about work on a problem set, “You’re asking me to go from point A to point B, why wouldn’t I use a car to get there?” And another, when asked about their largely AI-written work, replied, “Everyone is doing it.” Those are stories from a 15-minute conversation with a single professor…
Much of what’s driving student adoption is anxiety. In addition to the ordinary worries about academic performance, students feel time pressure from jobs, internships, or extracurriculars, and anxiety about GPA and transcripts for employers. It is difficult to say, “Here is a tool that can basically complete assignments for you, thus reducing anxiety and saving you 10 hours of work without eviscerating your GPA. By the way, don’t use it that way.” But for assignments to be meaningful, that sort of student self-restraint is critical.
Self-restraint is necessary, but do we really expect kids in junior high or even high school to show that kind of self-restraint? And what happens to the kids who don’t? How many years of gaming the system can you get away with before it short-circuits your entire education?
A 2024 study with the blunt title “Generative AI Can Harm Learning” found that “access to GPT-4 significantly improves performance … However, we additionally find that when access is subsequently taken away, students actually perform worse than those who never had access.” Another found that students who have access to a large language model overestimate how much they have learned. A 2025 study from Carnegie Mellon University and Microsoft Research concludes that higher confidence in gen AI is associated with less critical thinking. As with calculators, there will be many tasks where automation is more important than user comprehension, but for student work, a tool that improves the output but degrades the experience is a bad tradeoff.
There is, of course, a more positive, optimistic case for AI in schools. Maybe, like calculators, it will cause some disruption but not complete failure. Then again, reading and writing seem like pretty fundamental skills for any white-collar career. And AI doesn’t just simplify doing problems; it does the work for the students, all of it. If you can just take a photo of a math problem with your phone and it spits out the answer and every step in between, you’re probably not learning much.