Tag Archives: Unnatural Acts that can improve your thinking

The Halo Effect

via Unnatural Acts that can improve your thinking

The halo effect refers to a bias whereby the perception of a positive trait in a person or product positively influences further judgments about other traits of that person, or about other products by the same manufacturer. One of the more common halo effects is the judgment that a good-looking person is intelligent and amiable.

There is also a reverse halo effect whereby perception of a negative or undesirable trait in individuals, brands, or other things influences further negative judgments about the traits of that individual, brand, etc. If a person “looks evil” or “looks guilty” you may judge anything he says or does with suspicion; eventually you may feel confident that you have confirmed your first impression with solid evidence when, in fact, your evidence is completely tainted and conditioned by your first impression. The hope that the halo effect will influence a judge or jury is one reason some criminal lawyers might like their clients to be clean-shaven and dressed neatly when they appear at trial.

The phrase was coined by psychologist Edward Thorndike in 1920 to describe the way commanding officers rated their soldiers. He found that officers usually judged their men as being either good or bad “right across the board. There was little mixing of traits; few people were said to be good in one respect but bad in another.” The old saying that first impressions make lasting impressions is at the heart of the halo effect. If a soldier made a good (or bad) first impression on his commanding officer, that impression would influence the officer’s judgment of his future behavior. It is very unlikely that, given a group of soldiers, every one of them would be totally good or totally bad at everything, but the evaluations seemed to indicate that this was the case. More likely, the earlier perceptions either positively or negatively colored those later perceptions and judgments.

The halo effect seems to ride on the coattails of confirmation bias: once we’ve made a judgment about positive or negative traits, that judgment influences future perceptions so that they confirm our initial judgment.

Some researchers have found evidence that student evaluations of their college instructors are formed and remain stable after only a few minutes or hours in class. If a student evaluated a teacher highly early in the course, he or she was likely to rank the teacher highly at the end of the course. Unfortunately for those teachers who made bad first impressions on the students, their performance over the course of the term would be largely irrelevant to how they would be perceived by their students.

MORE . . .

Anecdotal Evidence (testimonials)

via Unnatural Acts that can improve your thinking

Testimonials and anecdotes are used to support claims in many fields. Advertisers often rely on testimonials to persuade consumers of the effectiveness or value of their products or services. Others use anecdotes to drive home the horror of some alleged activity or the danger of widely used electronic devices like cell phones. In the 1980s and early 1990s, there were many people, some in law enforcement, claiming that Satanists were abducting and abusing children on a massive scale. The anecdotes involved vivid descriptions of horrible sexual abuse, even murder of innocent children. The anecdotes were quite convincing, especially when they were repeated on nationally televised programs with popular hosts like Geraldo Rivera. A four-year study in the early 1990s found the allegations of satanic ritual abuse to be without merit. Researchers investigated more than 12,000 accusations and surveyed more than 11,000 psychiatric, social service, and law enforcement personnel. The researchers could find no unequivocal evidence for a single case of satanic cult ritual abuse.

There have also been scares fueled by anecdotes regarding such disparate items as silicone breast implants, cell phones, and vaccinations. In the 1990s many women blamed their cancers and other diseases on breast implants. Talk show hosts like Oprah Winfrey and Jenny Jones presented groups of women who were suffering from cancer or some other serious disease and who had been diagnosed after they’d had breast implants. The stories tugged at the heartstrings and brought tears to many sensitive eyes, but there was no scientific evidence of a causal connection between the implants and any disease. That fact did not prevent lawyers from extorting $4.25 billion from implant manufacturers. Marcia Angell, former executive editor of the New England Journal of Medicine, brought the wrath of feminist hell upon herself in 1992 when she wrote an editorial challenging the Food and Drug Administration’s decision to ban the manufacture of silicone breast implants; the scientific evidence wasn’t there to justify the ban. She eventually wrote a book describing the fiasco: Science on Trial: The Clash of Medical Evidence and the Law in the Breast Implant Case. The scientific evidence is now in: when the data were finally collected, they showed that women with silicone breast implants did not suffer cancer or any other disease at a significantly higher rate than women without implants, and the FDA has lifted its ban.

The public fear that cellphones might be causing brain tumors was first aroused not by scientists but by a talk show host. On January 23, 1993, Larry King’s guest was David Reynard, who announced that he and his wife Susan had sued NEC and GTE on the grounds that the cellphone David gave Susan caused his wife’s brain tumor. There was nothing but junk science to back up her claim, plus the fact that . . .

MORE . . .

Psychic Tricks, Fraud and Forer

Forer Effect

via Unnatural Acts that can improve your thinking

The Forer effect refers to the tendency of people to rate sets of statements as highly accurate for them personally even though the statements were not made about them personally and could apply to many people.

Psychologist Bertram R. Forer (1914-2000) found that people tend to accept vague and general personality descriptions as uniquely applicable to themselves without realizing that the same description could be applied to many people. Consider the following as if it were given to you as an evaluation of your personality.

You have a need for other people to like and admire you, and yet you tend to be critical of yourself. While you have some personality weaknesses you are generally able to compensate for them. You have considerable unused capacity that you have not turned to your advantage. Disciplined and self-controlled on the outside, you tend to be worrisome and insecure on the inside. At times you have serious doubts as to whether you have made the right decision or done the right thing. You prefer a certain amount of change and variety and become dissatisfied when hemmed in by restrictions and limitations. You also pride yourself as an independent thinker; and do not accept others’ statements without satisfactory proof. But you have found it unwise to be too frank in revealing yourself to others. At times you are extroverted, affable, and sociable, while at other times you are introverted, wary, and reserved. Some of your aspirations tend to be rather unrealistic.

Forer gave a personality test to his students, ignored their answers, and gave each student the above evaluation (taken from a newsstand astrology column). He asked them to rate the evaluation on a scale of 0 to 5, with “5” meaning the recipient felt the evaluation was an “excellent” assessment and “4” meaning the assessment was “good.” The class average was 4.26. That was in 1948. The test has been repeated hundreds of times with psychology students, and the average is still around 4.2 out of 5, or 84% accurate.

In short, Forer convinced people he could successfully read their character. His accuracy amazed his subjects, though his personality analysis was taken from a newsstand astrology column and was presented to people without regard to their sun sign. The Forer effect seems to explain, in part at least, why so many people think that pseudosciences “work”. Astrology, astrotherapy, biorhythms, cartomancy, chiromancy, the enneagram, fortune telling, graphology, rumpology, etc., seem to work because they seem to provide accurate personality analyses. Scientific studies of these pseudosciences demonstrate that they are not valid personality assessment tools, yet each has many satisfied customers who are convinced they are accurate.

The most common explanations given to account for the Forer effect are in terms of hope, wishful thinking, vanity, and the tendency to try to make sense out of experience. Forer’s own explanation was in terms of human gullibility. People tend to accept claims about themselves in proportion to their desire that the claims be true rather than in proportion to the empirical accuracy of the claims as measured by some non-subjective standard. We tend to accept questionable, even false statements about ourselves, if we deem them positive or flattering enough. We will often give very liberal interpretations to vague or inconsistent claims about ourselves in order to make sense out of the claims. Subjects who seek counseling from psychics, mediums, fortune tellers, mind readers, graphologists, etc., will often ignore false or questionable claims and, in many cases, by their own words or actions, will provide most of the information they erroneously attribute to a pseudoscientific counselor. Many such subjects often feel their counselors have provided them with profound and personal information. Such subjective validation, however, is of little scientific value.
MORE . . .

James Randi’s fiery takedown of psychic fraud

via http://www.ted.com

Legendary skeptic James Randi takes a fatal dose of homeopathic sleeping pills onstage, kicking off a searing 18-minute indictment of irrational beliefs. He throws out a challenge to the world’s psychics: Prove what you do is real, and I’ll give you a million dollars. (No takers yet.)

The Tricks

via Project Barnum

• Cold Reading

Making vague statements that will fit most people if they want them to

Cold reading is a series of techniques employed by psychics, mediums and mentalists that are used to manipulate the customer (sitter) into believing that the psychic can read their mind, or that the medium is in contact with a dead relative or friend.

A cold reading will involve things called ‘Forer statements’ (or Barnum statements), which are designed to encourage the sitter to fill in the gaps in the information being given. Though these statements may appear to be specific, they are really very open-ended and vague and could apply to almost anyone. Experiments have shown how similar statements can be taken personally even when issued to dozens of people at the same time!

Some examples of such statements would be:

  • “I sense that you are sometimes insecure, especially with people you don’t know very well.”
  • “You work with computers”
  • “You’re having problems with a friend or relative”

Here is ‘psychic’ James Van Praagh demonstrating what appears to be a very embarrassing cold reading:

• Rainbow Ruse

Ticking all potential boxes by making all-encompassing descriptions

Similar to Forer statements is the “rainbow ruse,” which involves a statement that covers all possibilities and often describes somebody as being two completely different types of person at the same time. Here are some examples:

  • “Most of the time you are positive and cheerful, but there has been a time in the past when you were very upset.”
  • “You are a very kind and considerate person, but occasionally you feel deep-seated anger.”
  • “I would say that you are mostly shy and quiet, but when the mood strikes you, you can easily become the centre of attention.”

• Hot/Warm Reading

Using information gained before the show about the audience

MORE . . .

More via Granite State Skeptics: learn how psychics work by reading the Psychic Pamphlet.

Magical Thinking

via Unnatural Acts that can improve your thinking

Magical thinking is “a fundamental dimension of a child’s thinking.” –Zusne and Jones

Magical thinking is a belief in the interconnectedness of all things through forces and powers that transcend physical connections. Magical thinking invests special powers and forces in things and sees them as symbols on various levels. According to anthropologist Dr. Phillips Stevens Jr., “the vast majority of the world’s peoples … believe that there are real connections between the symbol and its referent, and that some real and potentially measurable power flows between them.” He believes there is a neurobiological basis for this, though the specific content of any symbol is culturally determined. (“Magical Thinking in Complementary and Alternative Medicine,” Skeptical Inquirer, 2001, November/December.)

One of the driving principles of magical thinking is the notion that things that resemble each other are causally connected in some way that defies scientific testing (the so-called law of similarity: that like produces like, that effect resembles cause). Another driving principle of magical thinking is the belief that “things that have been either in physical contact or in spatial or temporal association with other things retain a connection after they are separated” (the so-called law of contagion) (James George Frazer, The Golden Bough: A Study in Magic and Religion; Stevens). Think of relics of saints that are supposed to transfer spiritual energy. Think of psychic detectives claiming that they can get information about a missing person by touching an object that belongs to the person (psychometry). Or think of the pet psychic who claims she can read your dog’s mind by looking at a photo of the dog. Or think of Rupert Sheldrake’s morphic resonance, the idea that there are mysterious telepathy-type interconnections between organisms and collective memories within species. (Coincidentally, Sheldrake also studies psychic dogs.)

MORE . . .

Cognitive Dissonance

via Unnatural Acts that can improve your thinking

Cognitive dissonance is a theory of human motivation that asserts that it is psychologically uncomfortable to hold contradictory cognitions. The theory is that dissonance, being unpleasant, motivates a person to change his cognition, attitude, or behavior. This theory was first explored in detail by social psychologist Leon Festinger, who described it this way:

Dissonance and consonance are relations among cognitions, that is, among opinions, beliefs, knowledge of the environment, and knowledge of one’s own actions and feelings. Two opinions, or beliefs, or items of knowledge are dissonant with each other if they do not fit together; that is, if they are inconsistent, or if, considering only the particular two items, one does not follow from the other (Festinger 1956: 25).

He argued that there are three ways to deal with cognitive dissonance. He did not consider these mutually exclusive.

  1. One may try to change one or more of the beliefs, opinions, or behaviors involved in the dissonance;
  2. One may try to acquire new information or beliefs that will increase the existing consonance and thus cause the total dissonance to be reduced; or,
  3. One may try to forget or reduce the importance of those cognitions that are in a dissonant relationship (Festinger 1956: 25-26).

For example, people who smoke know smoking is a bad habit. Some rationalize their behavior by looking on the bright side: They tell themselves that smoking helps keep the weight down and that there is a greater threat to health from being overweight than from smoking. Others quit smoking. Most of us are clever enough to come up with ad hoc hypotheses or rationalizations to save cherished notions. Why we can’t apply this cleverness more competently is not explained by noting that we are led to rationalize because we are trying to reduce or eliminate cognitive dissonance. Different people deal with psychological discomfort in different ways. Some ways are clearly more reasonable than others. So, why do some people react to dissonance with cognitive competence, while others respond with cognitive incompetence?

Cognitive dissonance has been called “the mind controller’s best friend” (Levine 2003: 202). Yet, a cursory examination of cognitive dissonance reveals that it is not the dissonance, but how people deal with it, that would be of interest to someone trying to control others when the evidence seems against them.

MORE . . .

Hindsight Bias

via Unnatural Acts that can improve your thinking

“The mind that makes up narratives about the past is a sense-making organ. When an unpredicted event occurs, we immediately adjust our view of the world to accommodate the surprise.”–Daniel Kahneman

Hindsight bias is the tendency to construct one’s memory after the fact (or interpret the meaning of something said in the past) according to currently known facts and one’s current beliefs. In this way, one appears to make the past consistent with the present and more predictive or predictable than it actually was. When a surprise event occurs and you say “I knew it all along,” you probably didn’t. Hindsight bias may be kicking in.

Hindsight bias accounts for the tendency of believers in prophecies and psychic predictions to retrofit events to past oracular claims, however vague or obscure (retroactive clairvoyance). For example, after the Challenger space shuttle disaster that killed seven U.S. astronauts on January 28, 1986, hindsight bias was used by followers of Nostradamus to claim that he had predicted it in the following verse:

D’humain troupeau neuf seront mis à part,
De jugement & conseil separés:
Leur sort sera divisé en départ,
Kappa, Thita, Lambda mors bannis égarés.

From the human flock nine will be sent away,
Separated from judgment and counsel:
Their fate will be sealed on departure
Kappa, Thita, Lambda the banished dead err (I.81).

Of course, to make the obscene retrodiction complete, Nostradamus’s minions would have to speculate that teacher-astronaut Christa McAuliffe was pregnant with twins to make nine the total in the “flock.” The belief that one can predict the future is often due to little more than the power of hindsight bias.

Hindsight bias also seems to account for the tendency of many people to think they can explain events that weren’t predicted after those events have happened. It is unacceptable to many people to think that a major event, like a respected Wall Street investment manager running a Ponzi scheme that cost people perhaps as much as $50 billion, wasn’t predictable. If only somebody had paid attention to this and that detail, Bernard Madoff could never have pulled it off. What is true is that a major impact event like this can easily be explained after the fact. The explanations may satisfy people and lead them to believe that they now understand how such an event happened, but there is no way to know whether collecting many facts and using them to explain what occurred will help prevent a similar event from happening in the future.

Why do we engage in hindsight bias? There are several reasons.

MORE . . .

Recency Bias

via Unnatural Acts that can improve your thinking

Recency bias is the tendency to think that trends and patterns we observe in the recent past will continue in the future. Predicting the future in the short term according to events in the recent past, even for highly changeable events like the weather or the stock market, works fine much of the time. Predicting the future in the long term according to what has recently occurred has been shown to be no more accurate than flipping a coin in many fields, including meteorology, economics, investments, technology assessment, demography, futurology, and organizational planning (Sherden, The Fortune Sellers).

Doesn’t it strike you as odd that, with all the intelligence supposedly going on, such things as the breakup of the Soviet Union, the crumbling of the Berlin wall, the former head of Sinn Fein meeting with the Queen of England, the worldwide economic collapse of recent years, the so-called “Arab spring,” the recent attacks on U.S. embassies in several Muslim countries, and a host of other significant historical events were not predicted by the experts? Wait, you say. So-and-so predicted this or that. Was it a lucky guess or was the prediction based on knowledge and skill? If the latter, we’d expect not just one correct prediction out of thousands, but a better track record than, say, flipping a coin. Find one expert who’s consistently right about anything and we still have a problem. How can we be sure that this sharpshooter isn’t just lucky? If thousands of people are making predictions, chance alone tells us that a few will make a right call now and then. The odds in favor of prediction success diminish the more events we bring in, but even someone who seems to defy the odds might be the one in a million that gets lucky with a string of guesses. Flip a coin enough times and once in a while you will get seven heads in a row. It’s not expected, but it is predicted by the laws of chance. Likewise with predicting how many hurricanes we’ll have next year or what stocks to buy or sell this year.
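The coin-flip point above is easy to check with a short simulation. Here is a minimal sketch (the pool size, number of calls, and streak length are illustrative assumptions, not figures from any study): give each of a pool of hypothetical forecasters a series of random yes/no calls, each right half the time by pure chance, and count how many of them produce an "impressive" streak of seven or more correct calls in a row.

```python
import random

random.seed(42)

N_FORECASTERS = 10_000   # hypothetical pool of pundits, each guessing at random
N_CALLS = 50             # binary predictions each forecaster makes
STREAK = 7               # length of run that would look like expertise

def longest_run(calls):
    """Return the length of the longest run of correct (True) calls."""
    best = run = 0
    for correct in calls:
        run = run + 1 if correct else 0
        best = max(best, run)
    return best

# Each call is correct with probability 0.5, like a coin flip.
lucky = sum(
    1
    for _ in range(N_FORECASTERS)
    if longest_run(random.random() < 0.5 for _ in range(N_CALLS)) >= STREAK
)

print(f"{lucky} of {N_FORECASTERS} random guessers hit a streak of {STREAK}+ correct calls")
```

With numbers like these, a noticeable fraction of pure guessers typically shows a seven-call streak at some point; single any one of them out after the fact and they look like a seer, which is exactly the sharpshooter problem described above.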

MORE . . .
