
Just a few decades ago, scenes that seem familiar to us today would have been surprising: someone using GPS to get to a place they’ve already been or taking out their phone to “remember” a piece of information. Why would someone look up something they can memorize?
Today, most people think the exact opposite: why memorize a phone number when I can save hundreds on my phone? Why rack my brains over a calculation that a calculator can do in seconds? Why learn something when I can look it up on Wikipedia or ask Artificial Intelligence? Or why pay attention to the details of a route when I can simply follow the GPS every time?
However, today we don’t just use technology to search for specific information; we increasingly depend on it for everything. This dependence has a name: cognitive crutches.
What are cognitive crutches?
Cognitive crutches are strategies or tools we use to delegate mental functions to something external, whether it’s a piece of paper as in the past, a device like a mobile phone, or a digital system like Artificial Intelligence. The aim is to free up mental resources to dedicate to other tasks.
Obviously, this phenomenon is not new; notebooks, shopping lists, and calculators have been part of our lives for a long time, but the arrival of the internet and, more recently, Artificial Intelligence, has elevated this practice to unprecedented levels.
What happens when we misuse technology?
The problem isn’t the tool itself, but how we use it. Using a digital calendar to remember appointments isn’t the same as constantly relying on a digital assistant for every step. Checking the exact date of a historical event isn’t the same as not knowing what happened. When technology stops being a complement and becomes a substitute for a basic mental process, the consequences are inevitable.
In 2011, researchers from Harvard, Columbia, and the University of Wisconsin published an article in the journal Science in which they coined the term “Google effect” to describe how, when we expect information to be readily available online, we tend not to remember it. The phenomenon is also known as “digital amnesia.”
These researchers asked a group of people to type a series of statements into a computer. Half were led to believe the statements would be saved and available for later reference; the other half were not. When participants were later asked to recall the statements, those who believed the information had been saved remembered significantly less.
Fast forward to 2025, and an experimental study conducted at the MIT Media Lab found that people who used generative AI to perform academic tasks (such as writing essays or answering questions) showed less neural activation and worse cognitive performance than those who worked more autonomously.
The results also pointed to what the researchers call cognitive laziness: reduced activation in brain regions involved in reasoning, memory, and deep learning. As the authors put it, “People who only worked with their brains showed the strongest and most distributed networks. Search engine users showed moderate engagement, and AI users showed the weakest connectivity.”
In short, “cognitive activity decreases with the external use of tools.” This doesn’t mean that AI magically makes us “dumber,” but it does raise an important warning: when we rely too heavily on these systems to do mental work that should be ours, the brain stops exercising certain cognitive skills. And that could lead, in the medium and long term, to their decline or even their loss.
Memory, critical thinking, and mental crutches
Science suggests that the constant use of external tools, whether the internet or AI assistants, influences our memory and thinking, with real effects on our neural connections. Instead of striving to understand information and integrate it into our knowledge system, we simply disregard it, trusting that it will remain readily available.
And it usually will be, but over-reliance on technology could limit the consolidation of the neural networks that underpin long-term memory and critical thinking. Continuous use of cognitive crutches fosters a more superficial approach to information, so we fail to train cognitive processes like analysis, synthesis, generalization, and information retrieval.
Obviously, this isn’t about demonizing technology, but about learning to use these tools without relinquishing control to them. That means using the internet to learn, not merely as an external repository of information, and using AI while sharpening our critical thinking skills. Technology shouldn’t (and frankly, it can’t, no matter how much they try to convince us otherwise) replace basic mental processes.
Ultimately, the human brain has evolved to process, integrate, and create meaning. Delegating tasks can be helpful, but only as long as we maintain an active role.
References:
Kosmyna, N. et al. (2025). Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task. MIT Media Lab. arXiv preprint arXiv:2506.08872.
Sparrow, B., Liu, J. & Wegner, D. M. (2011). Google Effects on Memory: Cognitive Consequences of Having Information at Our Fingertips. Science, 333(6043), 776–778.