A Cognitive Decline Foretold: Your Brain in the Age of AI
Technology promises ease, but our brains need challenge to stay sharp. What happens when we outsource thinking itself? A psychologist explores the cognitive cost of convenience.
Let me tell you about a conversation I had last week. A colleague mentioned she couldn’t remember her own phone number anymore. We laughed about it—haha, technology, right?—but then she said something that stuck with me: “I don’t really remember anything anymore. My phone remembers for me.”
She wasn’t kidding.
As a psychologist, I spend a lot of time thinking about how we think. And lately, I’ve been wondering if we’re conducting the world’s largest uncontrolled experiment on human cognition—with ourselves as the unwitting subjects.
Recent research from MIT revealed something unsettling. When people used ChatGPT or other AI tools to write essays, EEG recordings showed dramatically reduced activity in networks tied to cognitive processing, attention, and creativity. Even more striking: right after submitting their work, participants couldn’t recall what they’d just written. The AI had done the work, but their brains had essentially checked out.
This isn’t a Luddite rant about evil computers. This is about understanding what happens to a brain that evolved over millennia to solve problems when we suddenly hand those problems to machines.
Here’s the catch: your brain is what psychologists call a “cognitive miser”—it’s fundamentally lazy. I say this with empathy, because it’s my brain too. Our brains evolved to conserve energy whenever possible—it’s a basic survival strategy. Why climb the tree yourself when you can get someone else to grab the fruit? This served us beautifully for thousands of years. But now we’ve created technologies that exploit this tendency with ruthless efficiency for commercial profit.
And we’re starting to see the consequences. For most of the 20th century, IQ scores steadily increased—a phenomenon called the Flynn effect. But in developed countries, this trend has reversed for people born from the 1980s onward. Scandinavian military data shows it clearly, and researchers have linked this decline to changes in how we spend our time, particularly our increasing reliance on technology to do our cognitive heavy lifting.
The tech industry calls it “frictionless design,” which sounds lovely until you realize that friction is exactly what your brain needs to learn, remember, and grow. Every time you struggle to recall a fact, work through a problem, or synthesize competing ideas, you’re building neural pathways. You’re literally strengthening your brain’s architecture. Remove that friction, and those pathways never form.
Think about the last time you navigated somewhere without GPS. Can you even picture it? For most of us, that cognitive muscle has atrophied. We’ve outsourced spatial reasoning to Google Maps, and research shows our hippocampi¹—the brain regions crucial for navigation and memory—may actually be shrinking as a result.
But navigation is just the beginning. We’re now outsourcing the act of thinking itself.
When you ask an AI chatbot a question, it anchors your brain to one particular answer, one way of approaching a problem. This “anchoring effect” isn’t new—it’s a well-documented cognitive bias. But AI supercharges it. Instead of wrestling with multiple perspectives or generating novel solutions, we accept the first plausible-sounding response. As one researcher put it: AI can help you build the best candle ever, but it will never lead you to invent the lightbulb.
I see this in my own clinical practice. Young adults who grew up with smartphones describe feeling cognitively overwhelmed in ways previous generations did not. They struggle with sustained attention, arithmetic, and long-form reading. They feel anxious when forced to sit with uncertainty instead of immediately searching for answers online. They describe their minds as “scattered” and talk about “brain rot”—that fuzzy sensation of mental overload combined with an inability to focus deeply on anything.
What they’re experiencing is what former Apple and Microsoft executive Linda Stone calls “continuous partial attention”—a state where we’re constantly scanning for stimuli but never fully engaging with anything. It’s the psychological equivalent of eating only junk food. Sure, you’re consuming something, but you’re not getting any nutrition.
The education system is struggling with this. Teachers report that discussions have become shallower. Students sit in group work staring at laptops instead of talking to each other. When assigned essays, many use AI to generate them, learn nothing, and panic when they reach real-world jobs where they actually need to think independently.
One teacher told me about a student who submitted a brilliant paper—far beyond their usual work. When questioned, the student admitted they’d used ChatGPT but seemed genuinely confused about why this was a problem. “I still had to tell it what to write,” they said. They didn’t understand that the entire point of writing isn’t to produce text—it’s to develop the capacity for complex thought.
This is what worries me most: we’re creating a generation who can produce output without developing competence. They can generate essays without learning to construct arguments. They can find answers without building knowledge frameworks. They look productive while their cognitive abilities quietly erode.
And here’s the uncomfortable truth: none of us are immune. Every time I reach for my phone to calculate a tip instead of doing mental math, every time I ask Siri a question I could answer myself if I thought for thirty seconds, I’m choosing convenience over cognitive exercise.
But let’s be clear about something: this isn’t just about individual choices. We’re being sold a bill of goods by an industry that has no intention of slowing down to check whether its products might be rewiring human cognition in troubling ways.
AI companies are racing to embed their tools into every corner of our lives—our workplaces, schools, homes—with the urgency of a gold rush. They promise productivity, creativity, personalization. What they don’t mention is that we’re all guinea pigs in an experiment that has no control group and no informed consent form.
Think about it: when pharmaceutical companies develop a new drug, they spend years running trials, testing for side effects, proving efficacy. We demand this because we understand that messing with brain chemistry requires caution. But when tech companies roll out tools that fundamentally alter how we think, remember, and process information? They just... release them. No longitudinal studies on cognitive development. No rigorous testing on educational outcomes. No waiting to see if constant AI dependence might atrophy critical thinking skills the way extended bed rest atrophies muscles.
As one researcher pointed out, we wouldn’t accept a stranger in a bar handing us an untested pill and saying “this is great for your brain—just try it!” Yet we’re doing exactly that with AI tools deployed to millions of children whose brains are still developing.
The tech industry’s defense is predictable: these are just tools, and like any tool, they can be used well or poorly. But this sidesteps the fundamental question of whether we should be deploying these “tools” at scale before understanding their long-term effects on human psychology. It’s not fear-mongering to suggest that maybe—just maybe—we should pump the brakes and actually study what happens when an entire generation learns to outsource thinking during their formative years.
The market moves fast. Science moves slowly. And in that gap, we’re making irreversible decisions about human cognitive development.
So what do we do? Smash our devices and retreat to the woods to live off-grid? Tempting, but impractical.
In reality, there are no solutions, only trade-offs. We live in what some researchers call a “stupidogenic society”—one that makes it easy to be stupid, just as our obesogenic society makes it easy to be overweight. And just like with obesity, this is actually a “nice problem to have.” I’d rather live in a world with abundant food than one with famine. I’d rather have powerful AI than no technology at all.
But recognizing it’s a better problem doesn’t make it less of a problem.
Think about what happened when physical machines freed us from manual labor. Some people responded by taking up running, CrossFit, marathons—activities that would have seemed insane to previous generations who got more than enough physical exercise just surviving. Today, some people are probably fitter than any humans in history. Others struggle with obesity. We’ve created a massive range of physical fitness precisely because we’re no longer forced to be physically active. The bottom line: we still have a choice.
The same thing is happening with cognitive fitness. We’re no longer forced to do mental arithmetic, memorize directions, or recall information. Those basic skills used to be unavoidable—you needed them to function in daily life, to earn money, to avoid looking foolish. They were tedious, but they had useful by-products: they built pathways to more advanced skills and kept our mental muscles strong.
Now? The incentives have vanished. Your cognitive abilities have to clear a much higher bar before they add any economic value. Daily life doesn’t force practice on you anymore. Fast-food tills show pictures instead of requiring reading. Voice notes replace writing. Consumer software is so intuitive it needs no instructions.
Instead, we need to rebuild friction into our lives intentionally—not because we have to, but because we understand what we lose without it. Read actual books—the kind that require sustained attention. Navigate without GPS sometimes. Do math in your head. Have conversations without looking things up mid-discussion. Write something—anything—without AI assistance. Let yourself be bored occasionally instead of immediately reaching for stimulation.
Most importantly, we need to recognize that difficulty isn’t a bug in the system—it’s a feature of how learning works. When something feels hard, that’s often your brain doing exactly what it should be doing: building new connections, forming memories, developing capabilities.
The frictionless path is seductive. It promises efficiency, ease, answers without effort. But that’s not how human brains work. We’re not designed to be passive recipients of information. We’re designed to struggle, synthesize, and ultimately understand.
Maybe we’ll see a cognitive fitness movement emerge, the way running clubs and gyms proliferated after physical labor became optional. Perhaps people will pay for mental gymnasiums—places designed specifically to exercise cognitive skills that no longer have practical applications but remain essential for human flourishing. Some people will become sharper and more intellectually capable than any humans in history. Others won’t. And there will be a huge range in between.
Artificial intelligence isn’t making us stupid. But if we let it do all our thinking for us and neglect to build new skills, cognitive decline will be waiting around the corner, and we won’t be able to say we weren’t warned.
¹ Yes, humans have two hippocampi (singular hippocampus), one in each hemisphere of the brain. They’re located deep within the temporal lobes and are crucial for memory formation and spatial navigation.


