Tag Archives: Cognitive bias

Are Stupid People More Confident?

This might explain why some conspiracists believe what they believe. 🙂

By BrainStuff via YouTube

We’ve all heard about the supposed relationship between confidence and knowledge – but is it true? Two researchers think they’ve found the answer.

Also See:

The best Videos of the Week for 2013

via The Soap Box

Last year I started putting up one video per week on this page.

Now, I’ve had a lot of great videos on here, and today I’ve decided to look back at what I consider to be the five best videos of the week for 2013:

5. Alex Jones As Alien Lizard Explains Obamacare

Probably every skeptic around the world knows who Alex Jones is. While many skeptic bloggers have written at least a couple of articles to discredit him and/or show what kind of a fool he is, by far the best person to discredit Alex Jones and make him look like a fool… is Alex Jones.

This clip from Right Wing Watch’s YouTube page clearly shows why that’s true:

4. Debunking 9/11 conspiracy theorists part 6 of 7 – The psychology behind a 9/11 truther

From late 2012 to early 2013, Myles Power created a seven-part series that is, in my opinion, one of the best 9/11 conspiracy theory debunking series I have ever seen. The sixth video in the series, which explains the psychology and mindset of a 9/11 Truther, and in fact of most conspiracy theorists, could have stood on its own apart from the series.

MORE . . .

10 Tips for Telling Fact From Fiction

via HowStuffWorks

[ . . . ]

10: Beware of Cognitive Bias

Confirmation bias: Selective thinking whereby one tends to notice and to look for what confirms one’s beliefs, and to ignore, not look for, or undervalue the relevance of what contradicts one’s beliefs.

Our brains are designed to make sense of the onslaught of sensory stimulation and information that they get from the world by filtering and organizing. We have a tendency to focus on certain details and ignore others, to avoid being overwhelmed. And we habitually organize information into patterns, based on things we’ve seen or learned about before. That leads us to process what we hear, read or see in a way that reinforces what we think we already know. That phenomenon is called cognitive bias (source: Science Daily).

To make matters worse, some theorize that we also engage in selective exposure — that is, we pick sources of information that tell us what we want to hear. Ohio State researchers, for example, found that when college students spent a few minutes reading news articles online, they selected ones that supported their already-held views 58 percent of the time (source: Hsu).

The famous 1934 photograph of the Loch Ness monster. Just before his death in 1994, Chris Spurling confessed that he and some other men had staged the picture. (Keystone/Getty Images)

So, we’re vulnerable to information that fits what we want to believe — even if it’s of dubious authenticity. That’s probably why the infamous photograph of the Loch Ness monster, taken in 1934 (source: Nickell), was so convincing for many people. The silhouette resembled a long-necked dinosaur, which was something they had seen pictures of in natural history textbooks. And the idea that ancient creatures might have survived extinction already had surfaced in fiction such as Arthur Conan Doyle’s 1912 novel “The Lost World,” so it wasn’t too much of a leap conceptually. It wasn’t until 1994 that researchers got an elderly man who had been part of the hoax to reveal that the monster in the photo actually was a foot-high model, fashioned from a toy submarine (source: Associated Press).

9: Pay Attention to the Unspoken Message

If you’ve ever sold used cars or peddled vacuum sweepers door-to-door, you probably know this from experience: Researchers have found that an attractive physical appearance and positive nonverbal cues, like eye contact, smiling and a pleasant tone of voice, may have as much or more of an influence upon us than the actual words that the person is saying. In fact, someone who is skilled at nonverbal messaging can actually foster what communication experts call a halo effect. That is, if we think that a person looks good, we assume that he or she is intelligent or capable as well. That’s a big help in fostering credibility (source: Eadie). But just as a salesperson can learn to project a convincing demeanor, a swindler or a dishonest politician can practice the same tricks.

However, other nonverbal cues provide useful information for evaluating whether someone is telling the truth or a lie. Researchers who’ve studied the questioning of criminal suspects, for example, note that even highly motivated, skillful liars have a tendency to “leak” nonverbal clues to their deception in the course of a long interview, because of the difficulty of managing facial expressions, physical carriage, and tone of voice over time. The trick is to watch for those tiny flaws in the subject’s demeanor to emerge.

When making an untrue statement, for example, a person may flash a “microexpression” – a frown, perhaps, or a grimace – that reflects his or her true emotions but clashes with what the person is saying. Since some of these microexpressions may happen as quickly as the blink of an eye, the easiest way to detect them is by replaying a video, but it is possible to catch them in a real-time conversation as well. U.S. Coast Guard investigators trained in spotting such leakage, for example, have been able to spot such clues about 80 percent of the time (source: Matsumoto, et al.).

8: Watch for the Big Lie

Master of the Big Lie, Adolf Hitler is welcomed by supporters at Nuremberg. (Hulton Archive/Getty Images)

Throughout history, purveyors of falsehoods seldom have bothered with piddling minor fibs. Instead, they generally have opted for what propaganda experts call the “Big Lie” — that is, a blatant, outrageous falsehood about some important issue, and one that’s usually designed to inflame listeners’ emotions and provoke them to whatever action the liar has in mind. The Big Lie is most often associated with Adolf Hitler, who advised in his book “Mein Kampf” that the “primitive simplicity” of ordinary people makes them vulnerable to massive deceptions. “It would never come into their heads to fabricate colossal untruths, and would not believe that others would have the impudence to distort the truth so infamously,” the Nazi dictator wrote.

Ironically, even as he explained the method of the Big Lie, he used it to promote an especially brazen untruth — that Jews and Communists somehow had deceived the German people into thinking that their nation’s loss in World War I was caused by reckless, incompetent military leaders. The Nazi dictator was onto something, though perhaps even his own twisted mind didn’t grasp it: Some of the most effective Big Lies are accusations of someone else being a liar (source: Hitler).

Hitler, of course, didn’t invent the Big Lie, and a liar doesn’t necessarily have to be a bloodthirsty dictator to pull it off. But the best way to protect yourself against the Big Lie is to be an educated, well-informed person with a broad base of knowledge and context. Sadly, we live in a culture where fewer and fewer people seem to have that background. In 2011, Newsweek gave 1,000 Americans the U.S. citizenship test; more than a third scored a failing grade — 60 percent or lower — on questions such as “How many justices are on the Supreme Court?” and “Who did the U.S. fight in World War II?” That’s kind of scary (source: Quigley).

MORE – – –

Attribution Biases

via Unnatural Acts that can improve your thinking

Human behavior can be understood as issuing from “internal” factors or personal characteristics–such as motives, intentions, or personality traits–and from “external” factors–such as the physical or social environment and other factors deemed out of one’s personal control. Self-serving creatures that we are, we tend to attribute our own successes to our intelligence, knowledge, skill, perseverance, and other positive personal traits. Our failures are blamed on bad luck, sabotage by others, a lost lucky charm, and other such things. These attribution biases are referred to as the dispositional attribution bias and the situational attribution bias. They are applied in reverse when we try to explain the actions of others. Others succeed because they’re lucky or have connections and they fail because they’re stupid, wicked, or lazy.

We may tend to attribute the behaviors of others to their intentions because it is cognitively easier to do so. We often have no idea about the situational factors that might influence another person or cause them to do what they do. We can usually easily imagine, however, a personal motive or personality trait that could account for most human actions. We usually have little difficulty in seeing when situational factors are at play in affecting our own behavior. In fact, people tend to over-emphasize the role of the situation in their own behaviors and under-emphasize the role of their own personal motives or personality traits. Social psychologists refer to this tendency as the actor-observer bias.
One lesson here is that we should be careful when interpreting the behavior of others. What might appear to be laziness, dishonesty, or stupidity might be better explained by situational factors of which we are ignorant. Another lesson is that we might be giving ourselves more credit for our actions than we deserve. The situation may have driven us more than we admit. Maybe we “just did what anybody would do in that situation” or maybe we were just lucky. We may want to follow the classical Greek maxim “know thyself,” but modern neuroscience has awakened us to the fact that much of our thinking goes on at the unconscious level and we often don’t know what is really motivating us to do what we do or think what we think.

Something similar to the self-serving attribution of positive traits to explain our own behavior and negative traits to explain the behavior of others occurs with regard to beliefs.

MORE . . .

The 12 cognitive biases that prevent you from being rational

By George Dvorsky via io9.com

The human brain is capable of 10^16 processes per second, which makes it far more powerful than any computer currently in existence. But that doesn’t mean our brains don’t have major limitations. The lowly calculator can do math thousands of times better than we can, and our memories are often less than useless — plus, we’re subject to cognitive biases, those annoying glitches in our thinking that cause us to make questionable decisions and reach erroneous conclusions. Here are a dozen of the most common and pernicious cognitive biases that you need to know about.


Before we start, it’s important to distinguish between cognitive biases and logical fallacies. A logical fallacy is an error in logical argumentation (e.g. ad hominem attacks, slippery slopes, circular arguments, appeal to force, etc.). A cognitive bias, on the other hand, is a genuine deficiency or limitation in our thinking — a flaw in judgment that arises from errors of memory, social attribution, and miscalculations (such as statistical errors or a false sense of probability).

Some social psychologists believe our cognitive biases help us process information more efficiently, especially in dangerous situations. Still, they lead us to make grave mistakes. We may be prone to such errors in judgment, but at least we can be aware of them. Here are some important ones to keep in mind.

Confirmation Bias

We love to agree with people who agree with us. It’s why we only visit websites that express our political opinions, and why we mostly hang around people who hold similar views and tastes. We tend to be put off by individuals, groups, and news sources that make us feel uncomfortable or insecure about our views — the discomfort the psychologist Leon Festinger called cognitive dissonance. It’s this preferential mode of behavior that leads to the confirmation bias — the often unconscious act of referencing only those perspectives that fuel our pre-existing views, while at the same time ignoring or dismissing opinions — no matter how valid — that threaten our world view. And paradoxically, the internet has only made this tendency even worse.

Ingroup Bias

Somewhat similar to the confirmation bias is the ingroup bias, a manifestation of our innate tribalistic tendencies. And strangely, much of this effect may have to do with oxytocin — the so-called “love molecule.” This neurotransmitter, while helping us to forge tighter bonds with people in our ingroup, performs the exact opposite function for those on the outside — it makes us suspicious, fearful, and even disdainful of others. Ultimately, the ingroup bias causes us to overestimate the abilities and value of our immediate group at the expense of people we don’t really know.

Gambler’s Fallacy

It’s called a fallacy, but it’s more a glitch in our thinking. We tend to put a tremendous amount of weight on previous events, believing that they’ll somehow influence future outcomes. The classic example is coin-tossing. After flipping heads, say, five consecutive times, our inclination is to predict an increase in likelihood that the next coin toss will be tails — that the odds must now be in favor of tails. But in reality, the odds are still 50/50. As statisticians say, the outcomes of different tosses are statistically independent, and the probability of any outcome is still 50 percent.
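If you want to convince yourself of that independence, a quick simulation does the trick. Here is a minimal Python sketch (the function name, parameters, and seed are illustrative choices, not anything from the article): it flips a fair coin a million times and checks how often the flip immediately after a run of five heads comes up heads again.

```python
import random

def heads_after_streak(trials=1_000_000, streak=5, seed=42):
    """Estimate P(heads) on the flip immediately following `streak` consecutive heads."""
    rng = random.Random(seed)
    run = 0        # current run of consecutive heads
    observed = 0   # flips that occurred right after a qualifying streak
    heads = 0      # how many of those flips came up heads
    for _ in range(trials):
        flip = rng.random() < 0.5   # True = heads on a fair coin
        if run >= streak:
            observed += 1
            heads += flip
        run = run + 1 if flip else 0
    return heads / observed

print(heads_after_streak())   # prints a value very close to 0.5
```

Run it with different seeds and the conditional frequency barely moves from 0.5, which is exactly what statistical independence means: the coin has no memory of the streak.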

Relatedly, there’s also the positive expectation bias — which often fuels gambling addictions. It’s the sense that our luck has to change eventually and that good fortune is on the way. It also contributes to the “hot hand” misconception. And it’s the same feeling we get when we start a new relationship and convince ourselves it will be better than the last one.

Post-Purchase Rationalization

Remember that time you bought something totally unnecessary, faulty, or overly expensive, and then rationalized the purchase to such an extent that you convinced yourself it was a great idea all along? Yeah, that’s post-purchase rationalization in action — a kind of built-in mechanism that makes us feel better after we make crappy decisions, especially at the cash register. Also known as Buyer’s Stockholm Syndrome, it’s a way of subconsciously justifying our purchases — especially expensive ones. Social psychologists say it stems from the principle of commitment, our psychological desire to stay consistent and avoid a state of cognitive dissonance.

Neglecting Probability

Very few of us have a problem getting into a car and going for a drive, but many of us experience great trepidation about stepping inside an airplane and flying at 35,000 feet. Flying, quite obviously, is a wholly unnatural and seemingly hazardous activity. Yet virtually all of us know and acknowledge that the probability of dying in an auto accident is significantly greater than that of getting killed in a plane crash — but our brains refuse to internalize that crystal-clear logic (statistically, we have a 1 in 84 chance of dying in a vehicular accident, compared to a 1 in 5,000 chance of dying in a plane crash [other sources indicate odds as high as 1 in 20,000]). It’s the same phenomenon that makes us worry about getting killed in an act of terrorism as opposed to something far more probable, like falling down the stairs or accidental poisoning.

This is what Cass Sunstein calls probability neglect — our inability to properly grasp peril and risk — which often leads us to overstate the risks of relatively harmless activities while underrating more dangerous ones.
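Taking the figures quoted above at face value, the mismatch is easy to put a number on. The short sketch below simply does the arithmetic with those same odds (the variable names are illustrative; the odds are the ones cited above, not new data):

```python
# Lifetime odds quoted above (illustrative; they vary by source and year)
p_car = 1 / 84            # dying in a vehicular accident
p_plane = 1 / 5_000       # dying in a plane crash
p_plane_low = 1 / 20_000  # the more conservative figure some sources give

print(f"driving vs. flying risk ratio: {p_car / p_plane:.0f}x")      # ~60x
print(f"using the 1-in-20,000 figure:  {p_car / p_plane_low:.0f}x")  # ~238x
```

Either way, the activity most of us barely think about is the one that is dozens to hundreds of times more likely to kill us, which is exactly the mismatch probability neglect describes.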

MORE . . .

The Higgs and Wishful Thinking

by Steven Novella via NeuroLogica Blog

“I’m Good Enough, I’m Smart Enough, and Doggone It, People Like Me!”
– Daily Affirmations With Stuart Smalley.


The Higgs boson or Higgs particle is an elementary particle in the Standard Model of particle physics.

Self-help books are full of advice for thinking positively, and using affirmations to tell ourselves that the reality we wish to be true is in fact true. This is interesting because psychologists have discovered that people in general have a large positive cognitive bias – a wishful thinking bias. All other things being equal, we will tend to assume that what we wish to be true is actually true. Sometimes we can maintain this belief despite significant contradictory evidence.

It may be that this bias exists because it relieves cognitive dissonance. Essentially, it makes us feel better, and that may be sufficient. However, there is also a theory that such wishful or positive thinking is, to an extent, self-fulfilling. People who think they will be successful will take advantage of opportunities and work harder to make that success a reality. Expectations can even affect other people, the so-called Pygmalion effect. If teachers believe that a student will perform better, that expectation may improve the student’s performance.

Richard Wiseman points out, however, that visualizing the goal (“I am a success in my business”) does not work (so much for positive affirmations). What is helpful is visualizing the process by which a goal can be achieved.

Within the “New Age” spiritual community, however, this psychological discussion over the impact of positive or wishful thinking is all moot. Within this community there is the widely held belief, or at least claim, that wishful thinking does not just create a successful attitude – it actually alters reality. This belief reached its pinnacle, perhaps, in the widely successful book, The Secret. This book promoted what it called the “Law of Attraction” – that wishing something to be true attracted that very thing to you. Essentially the secret is that the universe will answer your wishes – so wish away.

This is literally a childish attitude. Children often behave as if asking the universe hard enough for something might produce the thing wished for. Most adults have learned that the universe does not work this way – or perhaps they have just learned to hide this childish desire that they still harbor. They use their better developed frontal lobes to rationalize what they wish to be true (manifesting as a positive cognitive bias). Reframing this wish-fulfillment desire as a “law” makes it sound a bit more respectable, however. The Secret, and other such nonsense, in essence just gave some adults permission to embrace their childhood wish-fulfillment fantasy.

What does all this have to do with the Higgs boson?

A recent article by Mike Adams on his website, Divinity Now (Exploring Conscious Cosmology), argues that the scientists who “discovered” the Higgs actually got the results they wished for through “intention” – the word used by believers to refer to wishing, again to make it sound a bit more respectable. And yes – that is the same Mike Adams of NaturalNews infamy – the crank site that promotes, in my opinion, all sorts of medical pseudoscience. Apparently Adams is branching out into consciousness pseudoscience.

MORE . . .
