A university lecturer in the U.S. says more and more students are turning to ChatGPT to write their essays—even ones about the ethics of AI. That’s raising big questions about whether students are still learning how to think critically.
Dr. Jocelyn Leitzinger, who teaches at the University of Illinois, noticed something alarming last semester: almost half of her 180 students misused the tool. Some essays even featured oddly generic names like "Sally" in supposedly personal stories, a telltale sign that AI had done the heavy lifting.
And it’s not just her observation. A recent preprint study out of MIT backs this up. In a small experiment with 54 adult learners, those who used ChatGPT wrote weaker essays and showed less brain activity, according to EEG readings.
Here's the kicker: 80% of the AI-assisted writers couldn't remember what they had just written. Meanwhile, those who wrote everything themselves did better on comprehension and showed more mental engagement.
Still, the researchers warn against jumping to conclusions. Despite headlines calling ChatGPT a "lazy-maker," they say more solid research is needed before drawing any big lessons about AI and learning.
Teachers have their own concerns. ChatGPT's writing might sound slick, but it often lacks real depth or fresh thinking. One student said he used it for brainstorming and summarizing lectures, but insisted that writing the actual essay was still his job.
Dr. Leitzinger puts it simply: when students rely on AI to write for them, they skip a key part of learning. “Writing is thinking, thinking is writing,” she said. “If we cut out that process, what happens to our ability to think?”