All of us, to a greater or lesser degree, are cognitive misers. We live in a complex, uncertain, and constantly changing world. Every day we face so many stimuli, and there are so many variables to consider, that it is perfectly understandable for our brain to take shortcuts and select the information that best fits our beliefs. That way we spare ourselves a great mental effort. However, that kind of mental laziness has consequences, and they are not exactly positive.
What is Cognitive Miserliness?
In 1984, psychologists Susan Fiske and Shelley Taylor introduced the concept of the cognitive miser for the first time. They used it to describe "those people who have a limited capacity to process information, so they take shortcuts whenever they can."
However, the truth is that we are all cognitive misers at times, since we tend to choose the shortest paths in our day-to-day life. Rather than behaving like rational scientists who carefully weigh the costs and benefits of each option, test hypotheses, and update their expectations and conclusions based on the results, we simply indulge in cognitive laziness and choose the easy path.
Obviously, we are more likely to use mental shortcuts when we are faced with uncertain and complex situations or when we have little knowledge about what is happening. In those cases, we try to simplify the problem. We are guided by a basic principle: save as much mental energy as possible, even in those situations where it is most necessary to “use our head”.
The path that cognitive misers walk
Cognitive misers tend to act in two ways: ignoring part of the information to reduce their cognitive load, or overweighting certain data so that they do not have to search for or process information that could undermine their beliefs or assumptions. As a result, they are particularly prone to confirmation bias.
In practice, cognitive misers tend to seek out, focus on, and favor information that confirms their beliefs or hypotheses, giving excessive weight to those data while ignoring the details that could undermine their ideas, simply because considering them would require greater mental effort.
Instead of searching through all the evidence relevant to their problem or the decision they must make, cognitive misers concentrate on the information that supports their initial hypothesis or alternative, ignoring or downplaying contrary or discordant data. They thus begin a partial search for information that prevents them from seeing the problem holistically.
They also tend to interpret information in a biased way, giving more weight to the data that support their theories and worldview. As a result of this non-rational thinking, it is easy for them to construct poorly adaptive mental schemas that do not correspond to reality, or to develop stereotypes that become self-limiting.
The consequences of cognitive miserliness
Thinking little makes us less rational and more prone to falling into the traps of stereotypes and prejudices. This knowledge deficit and, above all, motivated ignorance, give rise to a biased and not very rational vision of the world that prevents us from behaving adaptively.
Taking mental shortcuts can be convenient when we are walking down the street since our mind is not capable of processing all the stimuli that come to us, but doing so in front of important and complex problems causes us to make bad decisions.
When we are not able to form a general idea of the problem we are facing and we see it in a biased and polarized way, we are likely to ignore relevant variables and make hasty decisions that we later regret.
Another effect of cognitive miserliness is that it diminishes our ability to assess risks correctly. When we apply cognitive shortcuts we neglect important data, the small signals that help us understand how a series of errors can lead to a catastrophe. As a result of that cognitive blindness, we are less likely to learn our lesson for the future.
Enclosed in the echo chamber that we have built for ourselves, we do not see the world clearly; we limit ourselves to reinforcing our beliefs and stereotypes, keeping them in a closed system safe from refutation.
Stop being a cognitive miser
In 2013, researchers from the French National Center for Scientific Research posed this problem to 248 university students: “A bat and a ball together cost $1.10. The bat costs $1 more than the ball. How much does the ball cost?”
Without thinking too much, most of the participants responded that the bat costs $1 and the ball costs 10 cents. But that is wrong! The ball costs 5 cents and the bat costs $1.05.
79% of the participants took a mental shortcut: they did not make the effort to think through that simple calculation. The curious thing, however, is that most of them admitted they were not sure of their answer. In a way, they knew they had behaved like cognitive misers.
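The algebra behind the correct answer is simple once we slow down. A minimal sketch in Python, purely for illustration:

```python
# Bat-and-ball problem: bat + ball = 1.10 and bat = ball + 1.00.
# Substituting: ball + (ball + 1.00) = 1.10  ->  2 * ball = 0.10  ->  ball = 0.05
total = 1.10
difference = 1.00

ball = (total - difference) / 2
bat = ball + difference

print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05
```

The intuitive answer (10 cents) fails the second condition: a $1 bat is only 90 cents more than a 10-cent ball, not $1 more.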
In real life, these cognitive shortcuts are often harder to spot, but it is worth paying more attention to our intuition. If we are not very sure about an important decision we have made too lightly, it is probably our unconscious warning us that we have behaved like cognitive misers.
Another way to get around mental shortcuts is to stop and ask ourselves whether we have really evaluated all the possible variables, or whether we have analyzed the situation with an open mind. Fiske explained that when we are worried or distracted, we have less mental space to think carefully. By contrast, when we resume our routines and feel calm, we tend to think in a more rational, cautious, and open way.
In any case, we must be aware that mental shortcuts can be either rational or irrational. They are rational when they help us make quick decisions in emergencies. They are irrational when, in situations where we have enough time to reflect on our next steps, they push us to ignore the information that contradicts our point of view and could help us form a more faithful image of reality.
We must not forget that, as Michael Shermer said, “Smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons.”
Fiske, S. T. & Taylor, S. E. (2013) Social cognition: From brain to culture. London: Sage.
De Neys, W. et al. (2013) Bats, balls, and substitution sensitivity: Cognitive misers are no happy fools. Psychonomic Bulletin & Review; 20(2): 269-273.
Corcoran, K. & Mussweiler, T. (2010) The cognitive miser’s perspective: Social comparison as a heuristic in self-judgements. European Review of Social Psychology; 21(1): 78-113.