Confirmation bias refers to a type of selective thinking whereby one tends to notice and look for what confirms one’s beliefs, and to ignore, not look for, or undervalue the relevance of what contradicts one’s beliefs. For example, if you believe that during a full moon there is an increase in admissions to the emergency room where you work, you will take notice of admissions during a full moon but be inattentive to the moon when admissions occur during other nights of the month. A tendency to do this over time unjustifiably strengthens your belief in the relationship between the full moon and emergency-room admissions, and in lunar effects generally.
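The full-moon example comes down to counting all four cells of a contingency table (busy/quiet nights, full-moon/other nights) rather than attending to just one. A minimal sketch in Python, using made-up admission tallies (every number here is hypothetical), shows how the busy full-moon nights that stick in memory can have exactly the same busy rate as every other night:

```python
# Hypothetical tallies for a year of ER nights, chosen so that the
# rate of busy nights is identical on full-moon and other nights.
full_moon_nights = {"busy": 8, "quiet": 4}      # 12 full-moon nights
other_nights = {"busy": 232, "quiet": 116}      # 348 other nights

def busy_rate(nights):
    """Fraction of nights in this group that were busy."""
    return nights["busy"] / (nights["busy"] + nights["quiet"])

print(f"full-moon nights busy rate: {busy_rate(full_moon_nights):.2f}")
print(f"other nights busy rate:     {busy_rate(other_nights):.2f}")
```

Someone who notices only the 8 busy full-moon nights, and never tallies the quiet full-moon nights or the many busy ordinary nights, comes away convinced of a lunar effect that the full table shows is not there.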
This tendency to give more attention and weight to data that support our beliefs than we do to contrary data is especially pernicious when our beliefs are little more than prejudices. If our beliefs are firmly established on solid evidence and valid confirmatory experiments, the tendency to give more attention and weight to data that fit with our beliefs should not lead us astray as a rule. Of course, if we become blinded to evidence truly refuting a favored hypothesis, we have crossed the line from reasonableness to closed-mindedness.
Numerous studies have demonstrated that people generally give an excessive amount of value to confirmatory information, that is, to positive or supportive data. The “most likely reason for the excessive influence of confirmatory information is that it is easier to deal with cognitively” (Thomas Gilovich, How We Know What Isn’t So: The Fallibility of Human Reason in Everyday Life). It is much easier to see how data support a position than it is to see how they might count against the position. Consider a typical ESP experiment or a seemingly clairvoyant dream: Successes are often unambiguous or data are easily massaged to count as successes, while negative instances require intellectual effort even to see as negative or to consider as significant. The tendency to give more attention and weight to the positive and the confirmatory has been shown to influence memory. When digging into our memories for data relevant to a position, we are more likely to recall data that confirm the position.
Researchers are sometimes guilty of confirmation bias by setting up experiments or framing their data in ways that will tend to confirm their hypotheses.
More: Unnatural Acts that can improve your thinking: confirmation bias.
by Robert T. Carroll
The illusion of skill refers to the belief that skill, not chance or luck, accounts for the accuracy of predictions of things that are unpredictable, such as the long-term weather forecasts one finds in farmers’ almanacs and the predictions of market gurus about the long-term ups and downs of the stock market. The illusion of skill also accurately describes the apparent accuracy of remote viewers. Given all the guesses remote viewers make about what they claim to “see” psychically, chance alone would account for some of those guesses being somewhat accurate. Much of the accuracy ascribed to remote viewers, however, is due to the liberal and generous interpretations that they themselves, or “experts,” give to their vague and ambiguous descriptions of places or things. Subjective validation likewise accounts for the illusion of skill of “experts” in such fields as palm reading, mediumship, astrology, and criminal profiling.
Stock gurus, people who predict the rise and fall of the price of stocks and have large numbers of people who act on their predictions, are essentially part of an entire industry “built largely on an illusion of skill” (Daniel Kahneman, Thinking, Fast and Slow, p. 212). Yet no market guru has gone broke selling advice, even though market newsletters are, in the words of William A. Sherden, “the modern day equivalent of farmers’ almanacs” (The Fortune Sellers, p. 102). In 1994, the Hulbert Financial Digest found that over a five-year period only one out of 108 market-timing newsletters beat the market. You might think that that one did so because of skill, but you’d be wrong: chance alone would predict that more than one out of 108 would beat the market.
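That last claim is easy to check with a short simulation. Assuming, purely for illustration, that each newsletter’s yearly performance is a coin flip (as likely to beat the market as to trail it), chance alone predicts that dozens of the 108 newsletters, not just one, would beat the market over five years:

```python
import random

random.seed(0)

def chance_winners(n_letters=108, n_years=5, trials=5_000):
    """Average number of newsletters that beat the market by luck alone.
    Illustrative assumption (not from the study): each year a newsletter
    beats the market with probability 0.5, independently, and it 'beats
    the market' over the period if it wins in a majority of the years."""
    total = 0
    for _ in range(trials):
        for _ in range(n_letters):
            wins = sum(random.random() < 0.5 for _ in range(n_years))
            if wins > n_years // 2:  # 3 or more winning years out of 5
                total += 1
    return total / trials

print(f"expected winners by chance: {chance_winners():.1f} of 108")
```

Under this coin-flip model the expected count is 108 × 0.5 = 54, so finding only one winner is, if anything, worse than luck would predict, which underlines Sherden’s point.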
Keep Reading: Unnatural Acts that can improve your thinking: illusion of skill.
Inattentional blindness is an inability to perceive something that is within one’s direct perceptual field because one is attending to something else. The term was coined by psychologists Arien Mack and Irvin Rock, who identified the phenomenon while studying the relationship of attention to perception. They were able to show that, under a number of different conditions, if subjects were not attending to a visual stimulus but were attending to something else in the visual field, a significant percentage of the subjects were “blind” to something that was right before their eyes.
Because this inability to perceive, this sighted blindness, seemed to be caused by the fact that subjects were not attending to the stimulus but instead were attending to something else … we labeled this phenomenon inattentional blindness (IB).*
Mack and Rock go on to argue that, in their view, “there is no conscious perception without attention.” We might add that visual perception does not work like a video or any other kind of recorder. Objects or movements may occur in the visual field that are not attended to and may not be consciously or unconsciously perceived. Things can change in the visual field without our being aware of the changes. Perception, like memory, is a constructive process, and it seems that the brain builds its representations from a few salient details, often determined by our purposes or desires. Thus, two people may witness the same events but see and remember quite different things, even if both are good observers paying close attention to what is going on.
Read More: Unnatural Acts that can improve your thinking: inattentional blindness.
The illusion of understanding occurs frequently due to selection bias and confirmation bias. By selecting only data that support one’s position and ignoring relevant data that would falsify or compromise it, one can produce a convincing but misleading argument. By seeking only examples that confirm one’s belief and ignoring examples that disconfirm it or reveal the insignificance of the data one has put forth, one can easily create the illusion of understanding. The illusion of understanding is particularly prominent in the field of economic forecasting.
Think about it. If stock analysts could really beat the market consistently, wouldn’t they be stinking rich? Do you really think they are a clan of benevolent elves whose only goal is to help people like you get rich from their technical advice? Their cousins appear in infomercials all the time, telling stories about unfathomable riches that await you if you invest in their program. That’s how they make their money: not by using their program, but by selling it to others!
Keep Reading: Unnatural Acts that can improve your thinking: illusion of understanding.