
We like to think that we are rational and reasonable people. Logical. Objective. That belief, however, can work against us: it can convince us that we possess absolute reason, a REASON in capital letters that refuses to accept any counterargument and is, in fact, deeply unreasonable.
Julia Galef warned us: “Do you want to defend your own beliefs or do you want to see the world as clearly as possible? Because sometimes it is not possible to do both.” She was referring to one of the most dangerous cognitive biases we can experience: motivated reasoning.
What is Motivated Reasoning?
Motivated reasoning is a bias through which our desires, beliefs, fears and unconscious motivations shape the way we interpret the facts. It is the tendency to bend reality to what we already believe and to reject any arguments or facts that go against our convictions, beliefs and ideas.
It is an unconscious tendency to adjust the way we process information to conclusions we have already drawn, so that everything fits our belief system. As a result, we lose objectivity: we treat some pieces of information as allies, defending them tooth and nail, while perceiving whatever does not match our vision as an enemy to be torn down.
The motivated reasoning trap and intellectual laziness
In the 1950s, psychologists from Dartmouth and Princeton asked students from both universities to watch a recording of a football game between their schools' teams that included a series of controversial refereeing decisions.
After the viewing, students were more likely to perceive the referees' decisions as correct when they favored their own team, but tended to classify them as incorrect when they benefited the rival. The researchers concluded that the students' emotional investment and sense of belonging to their university shaped the way they analyzed the game.
That biased vision extends to every sphere of our life. Our judgment is swayed toward the side we want to win, and that applies to everything that touches us closely: it influences what we think about our health and relationships, determines whom we vote for, and even what we consider fair or unfair.
If we do not believe in climate change, we will dismiss every study showing that the planet is suffering from our actions. If we love drinking coffee, we will devalue the studies indicating it is harmful. If we do not believe in transcendental meditation, we will reject studies pointing to its benefits. And so on, ad infinitum.
In practice, we process information in a way that fits our prior beliefs and desires, so as to maintain the internal status quo and avoid being forced to change. When we are shown evidence that goes against our beliefs, we analyze it less thoroughly and are even likely to banish it from our mind.
In fact, perhaps on more than one occasion, while arguing with someone, we have admitted we were wrong and accepted their arguments, only to fall back on the same initial idea shortly afterwards.
The problem is that we are not aware that we are not being rational, that we do not weigh information objectively but cherry-pick the data, discarding everything that does not fit into our worldview. All this leads us into circular reasoning, into an intellectual immobility where there is no room for growth.
Nietzsche had already alerted us: “We have an energetic tendency for assimilating the old to the new, simplifying the complex and overlooking or repelling what is wholly contradictory […] An apparently antithetical drive for ignorance, for shutting-out, and for contentment with a closed horizon, for saying yes to ignorance and giving it for good”.
Why are we convinced that we are right?
1. Emotional attachment. Emotions are powerful incentives that act below the level of consciousness, steering our thinking. As a result, if we want something to be true, we will look for evidence that confirms it and ignore whatever refutes it.
2. Avoiding cognitive dissonance. When new information contradicts our belief system, it produces cognitive dissonance, which generates a state of anxiety. Often, to avoid the arduous intellectual work of taking on a different perspective and changing our point of view, we simply remain tied to our own vision, victims of intellectual laziness.
3. Maintaining a positive self-image. Our beliefs, values and ideas are part of our identity. When new information questions them, we may feel that our ego is under attack. If our ego is fragile, we will tend to lock ourselves in to “protect ourselves.” As a result, we reject opposing arguments and become even more attached to our own.
4. Presumption of objectivity. Since we take it for granted that we are rational people, we assume that we, and our ideas, are objective as well. An analysis carried out at Stanford University revealed that appeals to be more “rational”, “impartial” or “open-minded” actually have the opposite effect: they generate resistance to new information by making us suspect that someone wants to manipulate us. They put us on the defensive and “turn off” our rational mind.
5. Cultural validation. We share many of our ideas, beliefs and values with other people. These common points make us belong to certain groups, which provide bonds of affinity that protect our identity by validating our worldview. Accepting ideas contrary to those of our group can generate a sense of uprootedness that makes us feel bad.
The solution? Develop the scout mindset
When we think about something, two different systems come into play. The first system is fast, intuitive and emotional, and therefore prone to all kinds of cognitive biases. The second system activates later: it is slower, more reflective, logical and accurate.
That second system allows us to separate our emotional reaction, what we would like to be true, from the facts. It lets us think: “I would like climate change not to be real, but maybe it is. I had better analyze the evidence.”
Motivated reasoning does not allow this type of analysis. It jumps straight to hasty conclusions based on emotions, expectations and beliefs. To avoid this bias, Julia Galef proposes developing what she calls the scout mindset.
It is a curious mindset, open to change and willing to explore new ideas. It does not close itself off from what is different or from what contradicts our thoughts and expectations; instead, it takes an interest in it and investigates it in greater depth.
This mindset allows us to understand that our self-esteem does not depend directly on how right we are. It means that, to become more logical, objective and rational, we do not really need more logic; we need to learn to detach ourselves from the ego and understand that being wrong means we have learned something new. And that is good.
Remember this quote from Confucius: “Neither approve a person for expressing a certain opinion, nor reject a particular opinion for coming from a certain person.” We must open ourselves to ideas and evaluate them on their merits. We should not assume that some ideas are more valid simply because they come from us. Then, and only then, can we grow.
Sources:
Epley, N. & Gilovich, T. (2016) The Mechanics of Motivated Reasoning. Journal of Economic Perspectives; 30(3): 133–140.
Cohen, G. L. (2012) Identity, Belief, and Bias. In: Hanson, J. (Ed.) Ideology, Psychology, and Law. Oxford: Oxford University Press.
Ditto, P. H. & Lopez, D. L. (1992) Motivated skepticism: Use of differential decision criteria for preferred and nonpreferred conclusions. Journal of Personality and Social Psychology; 63: 568-584.
Kunda, Z. (1990) The case for motivated reasoning. Psychological Bulletin; 108: 480-498.
Kunda, Z. (1987) Motivated inference: Self-serving generation and evaluation of causal theories. Journal of Personality and Social Psychology; 53: 636-647.
Hastorf, A. H. & Cantril, H. (1954) They saw a game; a case study. The Journal of Abnormal and Social Psychology; 49(1): 129-134.