The human brain is capable of some 10¹⁶ processes per second, which makes it far more powerful than any computer currently in existence. But that doesn’t mean our brains don’t have major limitations. The lowly calculator can do math thousands of times better than we can, and our memories are often less than useless — plus, we’re subject to cognitive biases, those annoying glitches in our thinking that cause us to make questionable decisions and reach erroneous conclusions. Here are a dozen of the most common and pernicious cognitive biases that you need to know about.
Before we start, it’s important to distinguish between cognitive biases and logical fallacies. A logical fallacy is an error in logical argumentation (e.g. ad hominem attacks, slippery slopes, circular arguments, appeal to force, etc.). A cognitive bias, on the other hand, is a genuine deficiency or limitation in our thinking — a flaw in judgment that arises from errors of memory, social attribution, and miscalculations (such as statistical errors or a false sense of probability).
Some social psychologists believe our cognitive biases help us process information more efficiently, especially in dangerous situations. Still, they lead us to make grave mistakes. We may be prone to such errors in judgment, but at least we can be aware of them. Here are some important ones to keep in mind.
We love to agree with people who agree with us. It’s why we only visit websites that express our political opinions, and why we mostly hang around people who hold similar views and tastes. We tend to be put off by individuals, groups, and news sources that make us feel uncomfortable or insecure about our views — what the social psychologist Leon Festinger called cognitive dissonance. It’s this preferential mode of behavior that leads to the confirmation bias — the often unconscious act of referencing only those perspectives that fuel our pre-existing views, while at the same time ignoring or dismissing opinions — no matter how valid — that threaten our world view. And paradoxically, the internet has only made this tendency even worse.
Somewhat similar to the confirmation bias is the ingroup bias, a manifestation of our innate tribalistic tendencies. And strangely, much of this effect may have to do with oxytocin — the so-called “love molecule.” This neurotransmitter, while helping us to forge tighter bonds with people in our ingroup, performs the exact opposite function for those on the outside — it makes us suspicious, fearful, and even disdainful of others. Ultimately, the ingroup bias causes us to overestimate the abilities and value of our immediate group at the expense of people we don’t really know.
It’s called a fallacy, but it’s more a glitch in our thinking. We tend to put a tremendous amount of weight on previous events, believing that they’ll somehow influence future outcomes. The classic example is coin-tossing. After flipping heads, say, five consecutive times, our inclination is to predict that the next toss is more likely to come up tails — that the odds must now favor tails. But in reality, the odds are still 50/50. As statisticians say, each toss is statistically independent, so the probability of heads on any given toss remains 50%.
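That independence claim is easy to check numerically. The following is a minimal sketch (the function name is my own): it simulates a fair coin and looks only at tosses that immediately follow a run of five heads — if the gambler’s fallacy were true, those tosses would come up heads less than half the time.

```python
import random

random.seed(42)  # make the simulation reproducible

def heads_rate_after_streak(num_tosses=1_000_000, streak_len=5):
    """Fraction of heads among tosses that immediately follow
    a run of `streak_len` consecutive heads."""
    heads_after = total_after = 0
    run = 0  # current run of consecutive heads
    for _ in range(num_tosses):
        toss_is_heads = random.random() < 0.5
        if run >= streak_len:       # this toss follows a long streak
            total_after += 1
            heads_after += toss_is_heads
        run = run + 1 if toss_is_heads else 0
    return heads_after / total_after

print(round(heads_rate_after_streak(), 3))  # hovers around 0.5, not below
```

The streak has no pull on the next toss: the observed rate stays at roughly 50% no matter how long a run precedes it.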
Relatedly, there’s also the positive expectation bias — which often fuels gambling addictions. It’s the sense that our luck has to eventually change and that good fortune is on the way. It also contributes to the “hot hand” misconception. And it’s the same feeling that leads us, when starting a new relationship, to believe it will be better than the last one.
Remember that time you bought something totally unnecessary, faulty, or overly expensive, and then you rationalized the purchase to such an extent that you convinced yourself it was a great idea all along? Yeah, that’s post-purchase rationalization in action — a kind of built-in mechanism that makes us feel better after we make crappy decisions, especially at the cash register. Also known as Buyer’s Stockholm Syndrome, it’s a way of subconsciously justifying our purchases — especially expensive ones. Social psychologists say it stems from the principle of commitment, our psychological desire to stay consistent and avoid a state of cognitive dissonance.
Very few of us have a problem getting into a car and going for a drive, but many of us experience great trepidation about stepping inside an airplane and flying at 35,000 feet. Flying, quite obviously, is a wholly unnatural and seemingly hazardous activity. Yet virtually all of us know and acknowledge the fact that the probability of dying in an auto accident is significantly greater than getting killed in a plane crash — but our brains won’t surrender to this crystal-clear logic (statistically, we have a 1 in 84 chance of dying in a vehicular accident, as compared to a 1 in 5,000 chance of dying in a plane crash [other sources indicate odds as high as 1 in 20,000]). It’s the same phenomenon that makes us worry about getting killed in an act of terrorism as opposed to something far more probable, like falling down the stairs or accidental poisoning.
This is what the legal scholar Cass Sunstein calls probability neglect — our inability to properly grasp peril and risk, which often leads us to overstate the dangers of relatively harmless activities while understating those of genuinely dangerous ones.
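Taking the lifetime odds cited above at face value (they are the article’s figures, not independently verified here), the size of the mismatch is simple arithmetic:

```python
# Lifetime odds cited in the article; other sources give different figures.
p_car_death = 1 / 84       # dying in a vehicular accident
p_plane_death = 1 / 5_000  # dying in a plane crash

ratio = p_car_death / p_plane_death  # 5000 / 84, about 60
print(f"By these figures, a fatal car accident is about {ratio:.0f}x "
      "more likely over a lifetime than a fatal plane crash.")
```

A roughly sixty-fold difference, and yet it’s the rarer event that keeps us up at night.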
Humans have debated the issue of free will for millennia. But over the past several years, while the philosophers continue to argue about the metaphysical underpinnings of human choice, an increasing number of neuroscientists have started to tackle the issue head on — quite literally. And some of them believe that their experiments reveal that our subjective experience of freedom may be nothing more than an illusion. Here’s why you probably don’t have free will.
Indeed, historically speaking, philosophers have had plenty to say on the matter. Their ruminations have given rise to such considerations as cosmological determinism (the notion that everything proceeds over the course of time in a predictable way, making free will impossible), indeterminism (the idea that the universe and our actions within it are random, also making free will impossible), libertarianism (the view that free will is real and that determinism is therefore false), and compatibilism (the suggestion that free will is logically compatible with deterministic views of the universe).
Now, while these lines of inquiry are clearly important, one cannot help but feel that they’re also terribly unhelpful and inadequate. What the debate needs is some actual science — something a bit more…testable.
And indeed, this is starting to happen. As the early results of scientific brain experiments are showing, our minds appear to be making decisions before we’re actually aware of them — at times by a significant margin. It’s a disturbing observation that has led some neuroscientists to conclude that we’re less in control of our choices than we think — at least as far as some basic movements and tasks are concerned.
At the same time, however, not everyone is convinced. It may be a while before we can truly prove that free will is an illusion.