Stop for a second and answer these simple questions:
In the Bible, which animal swallowed Jonah?
How many animals of each type did Moses carry in the ark?
If you are like most people, you will have answered “whale” to the first question and “two” to the second. Yet very few people realize that, according to the Bible, it was not Moses but Noah who built the ark.
This phenomenon is known as the Moses’ Illusion, and it has profound implications for our daily life because it reveals how poor we are at detecting errors in the information around us. Even when we know the correct facts, we tend to overlook the mistakes.
Blindness to errors
In 1981, two psychologists, Thomas D. Erickson of the University of California and Mark E. Mattson of the State University of New York, discovered that 80% of people failed to notice the error in such questions.
Remarkably, participants failed to notice the error even when they were warned that some of the questions might contain mistakes, and even when time pressure was removed so they could think more calmly and answer without stress.
Psychologists at Duke University went one step further and replicated the experiment, this time highlighting in red the key data that participants needed to evaluate. The results were disastrous.
Most people not only continued to miss the error in the questions; in a later test they reproduced the erroneous data in their own answers, indicating that they had incorporated it into their conception of the world.
The problem is that just a few days before the test, the psychologists had assessed the participants’ knowledge, and it was correct. This means that even though we apparently overlook the wrong details, our minds are actually taking note of them and incorporating them into our knowledge system.
Everything is true until proven otherwise
We all like to think we are smart, and that if we encounter an error or a piece of false information we will notice it and refuse to believe it. In reality, anyone can be deceived. The Moses’ Illusion is rooted in the way we process information.
Spinoza hypothesized that when we encounter an idea, instead of following a logical path of evaluation before accepting or rejecting it, we accept it automatically. Rejection would be a second step that requires additional cognitive effort.
Science confirms his hypothesis. Researchers from the University of Texas asked a group of people to act as judges and indicate what sentences they would hand down to two criminals. The “catch” was that the police reports contained both true and false statements, printed in different colors.
Although the participants were warned that the reports contained false data, and were told which statements were false, they recommended sentences almost twice as long when the false statements exacerbated the severity of the crime. This shows that we initially assume what we read or hear to be true, and only after reflecting on it can we classify it as false.
Why are we positively biased?
The Truth-Default Theory (TDT)
We are all prone to what is known as “truth bias”, which occurs regardless of the source of the information or prior knowledge we have.
According to the Truth-Default Theory (TDT), we assume that others are honest. We do not consider deception a possibility in communication until we come across clues that make us doubt. In fact, a study at the University of Alabama indicated that our accuracy in detecting lies is below 50%.
That initial tendency to assume statements are true is probably a bias that facilitates communication. After all, it is much easier to assume that the person in front of us is telling the truth than to run everything they say through a “lie detector”.
In fact, we do not fall for the Moses’ Illusion when the information is clearly wrong. Psychologists at Northwestern University found that we are less susceptible to implausible inaccuracies than to plausible ones. If we were asked, “How many animals of each type did Kennedy carry in the ark?”, we would notice the error. The problem arises when the wrong information is plausible.
Is it possible to escape the Moses’ Illusion?
Having experience or greater knowledge of a topic leaves us better prepared to detect errors, falsehoods, and misinformation. A study conducted at Duke University, for example, found that History students detect historical errors in claims better than Biology students do, and vice versa. However, prior knowledge is not enough, because we often fail to use it.
An experiment carried out at Vanderbilt University found that the most effective way to reduce the Moses’ Illusion is to act as if we were fact-checkers; in other words, to adopt a critical attitude from the start and verify all the information.
It is a considerable cognitive effort, but activating our critical thinking is the only way to protect ourselves from manipulation, deception and disinformation.
References
Fazio, L. (2020) Why you stink at fact-checking. In: The Conversation.
Cantor, A. D. & Marsh, E. J. (2017) Expertise effects in the Moses illusion: detecting contradictions with stored knowledge. Memory; 25(2): 220-230.
Hinze, S. R. et al. (2014) Pilgrims sailing the Titanic: Plausibility effects on memory for misinformation. Memory & Cognition; 42: 305-324.
Marsh, E. J. & Umanath, S. (2014) Knowledge neglect: Failures to notice contradictions with stored knowledge. In: D. N. Rapp & J. L. G. Braasch (Eds.), Processing inaccurate information: Theoretical and applied perspectives from cognitive science and the educational sciences (pp. 161-180).
Levine, T. R. (2014) Truth-Default Theory (TDT): A Theory of Human Deception and Deception Detection. Journal of Language and Social Psychology; 33(4): 378-392.
Eslick, A. N. et al. (2011) Ironic effects of drawing attention to story errors. Memory; 19(2): 184-191.
Marsh, E. J. & Fazio, L. K. (2006) Learning errors from fiction: difficulties in reducing reliance on fictional stories. Memory & Cognition; 34(5): 1140-1149.
Gilbert, D. T. (1993) You can’t not believe everything you read. Journal of Personality and Social Psychology; 65(2): 221-233.
Erickson, T. D. & Mattson, M. E. (1981) From words to meaning: A semantic illusion. Journal of Verbal Learning and Verbal Behavior; 20(5): 540-551.