Rarely (outside of organized religion) has the combination of ignorance and fraud been as profitable as with so-called paranormal investigators.
Pseudoscience is like bubble gum. It tastes pretty good, it’s fun to blow bubbles, and it annoys some people. But eventually, the flavor leaves, and you find that you’re just chewing on some nutritionally dubious substance. Now you have to find a place to spit it out.
Or I guess you can swallow it, and it stays in your intestines for the rest of your life. Oh sorry, that’s more junk science.
If you read something that makes a medical claim, here’s a quick and easy checklist to determine whether it’s pseudoscience or real science-based medicine. What we all need is an official, Skeptical Raptor-endorsed pseudoscience detector.
Here it is, your own pseudoscience detector, based on a scientific seven-point checklist for fake science.
When I was an intern doing a rotation in the emergency department, on one particularly busy shift a nurse commented (to no one in particular) that it must be a full moon. I habitually look at the moon and generally know what phase it is in (right now it is a waxing gibbous, almost full), and so I knew at the time that in fact there was a crescent moon in the sky. I informed her of this. She gave a disappointed look and then went on with her work without any apparent further thought on the matter.
The episode struck me at the time. It seemed to me that I just witnessed a clear example of confirmation bias – what if it had been near a full moon? That would have confirmed her prior belief in a lunar effect, while this negative correlation was brushed aside and likely did not have any negative effect on her belief. (Although, my interpretation and memory of this event can itself be an example of confirmation bias regarding confirmation bias.)
Belief in the so-called lunar effect, that the phases of the moon exert an influence on human behavior with the most common element being a full moon inducing extreme behavior, is very common. In my experience it is one of the most common pseudoscientific beliefs I encounter in the general public. One survey indicates that 43% of adults believe in the lunar effect, with belief especially common among mental health professionals, including nurses.
When someone expresses such a belief to me I often use it as an opening to discuss skeptical principles. While belief in the lunar effect is widespread, it is usually not part of any emotionally held religious or ideological belief. It is therefore an excellent teaching opportunity. One question I like to ask is, “how do you think that works?” The most common answer I receive is probably the least plausible – that the tidal effects of the moon influence the brain because the brain is sitting in water (spinal fluid).
The tidal effect answer is incredibly implausible for a number of reasons.
By Mason I. Bilderberg
Before I forget …
This is a video I recently saw on a Facebook page.
The video shows a large convoy of tractor trailer trucks traveling on Virginia’s Interstate 64 being escorted by State Troopers. Take a look:
As I watched the video I couldn’t think of why these trucks would be driving in such a formation (I’ve included the answer at the bottom of this post). I didn’t think much of it, really. Most people didn’t think much of it. That’s because when most people don’t know who, what, where, why or when, they simply say “I don’t know.” But not conspiracists …
When confronted with an unknown, conspiracists immediately fill their information void with something they want to believe (usually some kind of apocalyptic plan by lizard people to starve, kill, destroy and otherwise control earth people). It’s this ability by conspiracists to build a confirmation bias echo chamber out of absolutely nothing that I find really, really entertaining.
So now, for your entertainment, here are just a few of the comments I found associated with this video. Enjoy the lunacy.
So what is reality? Why were these trucks being escorted down a highway in Virginia? Read the government’s “cover story” here, courtesy of snopes.com.
Mason I. Bilderberg (MIB)
by Gordon Bonnet via Skeptophilia
What cracked me up about this one is the way the author of the story, Tom Rose, seems to take it as given that (1) chemtrails exist, and (2) UFOs exist, so clearly they must have some connection. Here’s how Rose introduces the topic:
An incredible UFO video was uploaded to YouTube on Oct. 12, showing what appears to be an unidentified flying object “refueling” itself in the chemical contrail of a jet flying high above it. The strange object, which resembles no known aircraft, and flies in a decidedly non-aerodynamic manner, seems to intentionally head directly into the chemtrail of the passing jet and hovers for a moment before moving on.
The “chemical contrails” of jets are made almost entirely of water vapor and carbon dioxide, so it’s a little hard to see how the UFO would be “refueling” itself with them. Rose is right, however, insofar as water vapor and carbon dioxide are both “chemicals.”
He goes on to write:
The nearly two minute video shows the original, unedited footage, without enhancement, in the opening segment, before switching to a magnified and slowed down version, which doesn’t help to clear up the mysterious behavior of the unidentified aircraft. In fact, the closeup reveals that, although there seems to be a flashing, navigational beacon on the UFO, its shape and configuration resembles no known aircraft, such as a helicopter, airplane or even a drone.
True, and there’s a reason for that, which I’ll get to in a moment.
He finishes up with a bang:
The chemtrail controversy has been raging for a few years now, with conspiracy theorists arguing there must be some secret meaning behind their sudden proliferation. Could this incident explain the phenomenon? Is it possible that alien aircraft are using the chemical exhaust fumes of high flying aircraft to refuel spacecraft in Earth’s atmosphere?
The last sentence would be the odds-on favorite in a contest for the statement that caused the fastest simultaneous guffaw and facepalm. But let’s not be hasty, here.
By Benjamin Radford via LiveScience
Amazing coincidences happen all the time — but are they simply the product of random chance, or do they convey some hidden meaning? The answer may depend on whether you believe in synchronicity.
The term synchronicity was coined by Swiss psychiatrist Carl Jung (1875-1961). Jung had a strong belief in a wide variety of paranormal phenomena, including psychic powers, astrology, alchemy, predictive dreams, UFOs and telekinesis (moving objects with the mind). He was also obsessed with numerology — the belief that certain numbers have special cosmic significance, and can predict important life events.
Jung’s concept of synchronicity is complicated and poorly defined, but can be boiled down to describing “meaningful coincidences.” The concept of synchronicity came to Jung during a period of mental illness in the early 1900s. Jung became convinced that everything in the universe is intimately connected, and that suggested to him that there must exist a collective unconscious of humankind. This implied to him that events happening all over the world at the same time must be connected in some unknown way.
In his book “137: Jung, Pauli, and the Pursuit of a Scientific Obsession,” Arthur I. Miller gives an example of synchronicity: one of Jung’s patients “told Jung that when her mother and grandmother died, on each occasion a flock of birds gathered outside the window of the room.” The woman’s husband, who had symptoms of heart problems, went out to see a doctor, and “on his way back the man collapsed in the street. Shortly after he had set off to see the specialist a large flock of birds had alighted on the house. His wife immediately recognized this as a sign of her husband’s impending death.”
There is, of course, a more prosaic explanation for this curious coincidence: birds are very common, and simply by random chance a flock will sometimes appear near people who are soon to die — just as flocks appear daily around millions of people who are not soon to die.
The appearance of synchronicity is the result of a well-known psychological phenomenon called confirmation bias (sometimes described as remembering the hits and forgetting the misses); we much more easily notice and remember things that confirm our beliefs than those that do not. The human brain is very good at making connections and seeing designs in ambiguous stimuli and random patterns.
If Jung’s patient came to believe that a flock of birds meant that death was imminent, she would start noticing flocks of birds, and remember the times when they coincided with a loved one’s death. But she would not likely notice or remember the countless times when flocks of birds appeared over people who lived for years or decades longer. Put another way, a person dying when a flock of birds is present is an event; a person not dying when a flock of birds is present is a non-event, and therefore not something anyone pays attention to. This is the result of normal human perceptual and memory biases, not some mysterious cosmic synchronicity.
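The hits-and-misses arithmetic behind this can be made concrete with a quick simulation. In the Python sketch below, every number is made up for illustration, and the two events are drawn completely independently, so any “omens” it produces are pure coincidence:

```python
import random

random.seed(1)

# All probabilities here are assumed for illustration; the point is only
# that the two events are drawn independently, with no causal link at all.
DAYS = 100_000
P_FLOCK = 0.05   # assumed chance a flock gathers near the house on a given day
P_DEATH = 0.001  # assumed (exaggerated) chance of a death in the household that day

hits = misses = 0
for _ in range(DAYS):
    flock = random.random() < P_FLOCK
    death = random.random() < P_DEATH
    if flock and death:
        hits += 1    # the memorable "omen": birds appeared and someone died
    elif flock:
        misses += 1  # the forgettable non-event: birds appeared, nobody died

print(f"hits: {hits}, misses: {misses}")
```

Even with zero causal connection, a handful of “hits” turn up, while the thousands of misses vastly outnumber them — and it is only the hits that anyone remembers.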
It’s easy to see why synchronicity has mass appeal; it provides meaning and order in an otherwise random universe. One famous (and more modern) example of synchronicity is . . .
Last year I started putting up on this page one video per week.
Now I’ve had a lot of videos on here that were just great, and today I’ve decided to have a look back at what I consider to be the five best videos of the week for 2013:
5. Alex Jones As Alien Lizard Explains Obamacare
Probably every skeptic around the world knows who Alex Jones is. While many skeptic bloggers have written at least a couple of articles to discredit him or show what kind of a fool he is, by far the best person to discredit Alex Jones and make him look like a fool… is Alex Jones.
This clip from Right Wing Watch’s Youtube page clearly shows why that’s true:
4. Debunking 9/11 conspiracy theorists part 6 of 7 – The psychology behind a 9/11 truther
From late 2012 to early 2013 Myles Power created a seven-part series that is, in my opinion, one of the best 9/11 conspiracy theory debunking videos I have ever seen. The sixth video in the series, which explains the psychology and mindset of a 9/11 Truther, and in fact of most conspiracy theorists, could have stood alone apart from the series.
While I did touch upon ten of what I considered to be the biggest lies, I still felt there were more lies promoted by people in the 9/11 Truth Movement that needed to be addressed.
So, I have put together another list of ten more lies that Truthers tell:
10. Nothing hit World Trade Center 7.
Actually something did hit World Trade Center 7… a skyscraper.
To be more precise, falling debris from World Trade Center 1 hit World Trade Center 7 and caused huge amounts of damage to the lower floors of the building. The combination of that damage and the fact that the building had been on fire for hours caused the building to collapse.
9. Only two buildings were hit, but three were destroyed.
This is not true. In fact, more than three buildings were destroyed that day. World Trade Center 3, 4, 5, and 6 were heavily damaged, and what was left of them had to be torn down because they could not be repaired.
Also, many other buildings around the World Trade Center were damaged as well.
8. A nuclear bomb brought down the towers.
If this were true, it would be the easiest claim to test: all you would have to do is go down to the World Trade Center site with a Geiger counter, and you would easily find large amounts of radiation there.
Also, lower Manhattan would be uninhabitable right now due to that radiation. Plus, the destruction would have been far greater, and a lot more people would have died, either from the initial blast from the weapon or from the radiation and radioactive fallout.
Plus, there would have been an obvious flash, somewhat similar to the Sun, when the device went off, and there would have been no way to hide it.
7. The towers were reduced to dust and gravel.
This claim is promoted primarily by followers of Judy Wood, who believe her theory that the towers were brought down by high-energy lasers, which allegedly reduced them to dust and gravel.
While the collapse of the towers did create a lot of dust and gravel, it also left large chunks of concrete, long pieces of steel beams, and even places where pieces of the outer wall several stories high still stood.
6. Israel did it.
Besides the fact that there is no evidence whatsoever that Israel did this, Israel had no reason to do something like this.
The United States is Israel’s biggest supporter, and President George W. Bush was one of Israel’s strongest supporters at that.
To put it simply, the people in charge of Israel would have had to have lost their minds to do something like that. Not only would they have risked losing support from the United States, they would also have risked going to war with the United States, all in order to get more support from the United States.
This is volume 1 of The Con Academy videos—another resource in the Skeptics Society‘s arsenal of Skepticism 101 for teaching critical thinking and promoting science through the use of humor, wit, and satire. In this faux commercial for The Con Academy you’ll see how psychics count on confirmation bias to convince people that their powers are real when, in fact, they are just remembering the hits and forgetting the misses. We also demonstrate how psychic “organizations” con people by taking their money for services that are not real.
The human brain is capable of 10^16 processes per second, which makes it far more powerful than any computer currently in existence. But that doesn’t mean our brains don’t have major limitations. The lowly calculator can do math thousands of times better than we can, and our memories are often less than useless — plus, we’re subject to cognitive biases, those annoying glitches in our thinking that cause us to make questionable decisions and reach erroneous conclusions. Here are a dozen of the most common and pernicious cognitive biases that you need to know about.
Before we start, it’s important to distinguish between cognitive biases and logical fallacies. A logical fallacy is an error in logical argumentation (e.g. ad hominem attacks, slippery slopes, circular arguments, appeal to force, etc.). A cognitive bias, on the other hand, is a genuine deficiency or limitation in our thinking — a flaw in judgment that arises from errors of memory, social attribution, and miscalculations (such as statistical errors or a false sense of probability).
Some social psychologists believe our cognitive biases help us process information more efficiently, especially in dangerous situations. Still, they lead us to make grave mistakes. We may be prone to such errors in judgment, but at least we can be aware of them. Here are some important ones to keep in mind.
We love to agree with people who agree with us. It’s why we only visit websites that express our political opinions, and why we mostly hang around people who hold similar views and tastes. We tend to be put off by individuals, groups, and news sources that make us feel uncomfortable or insecure about our views — an unease the psychologist Leon Festinger called cognitive dissonance. It’s this preferential mode of behavior that leads to the confirmation bias — the often unconscious act of referencing only those perspectives that fuel our pre-existing views, while at the same time ignoring or dismissing opinions — no matter how valid — that threaten our worldview. And paradoxically, the internet has only made this tendency even worse.
Somewhat similar to the confirmation bias is the ingroup bias, a manifestation of our innate tribalistic tendencies. And strangely, much of this effect may have to do with oxytocin — the so-called “love molecule.” This hormone, while helping us to forge tighter bonds with people in our ingroup, performs the exact opposite function for those on the outside — it makes us suspicious, fearful, and even disdainful of others. Ultimately, the ingroup bias causes us to overestimate the abilities and value of our immediate group at the expense of people we don’t really know.
It’s called a fallacy, but it’s more a glitch in our thinking. We tend to put a tremendous amount of weight on previous events, believing that they’ll somehow influence future outcomes. The classic example is coin-tossing. After flipping heads, say, five consecutive times, our inclination is to predict an increase in likelihood that the next coin toss will be tails — that the odds must certainly be in favor of tails. But in reality, the odds are still 50/50. As statisticians say, the outcomes of different tosses are statistically independent and the probability of any outcome is still 50%.
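The independence claim is easy to check empirically rather than take on faith. This Python sketch simulates a long run of fair coin flips and looks only at the flip that immediately follows every streak of five heads:

```python
import random

random.seed(0)

# True = heads, False = tails, each flip independent and fair.
flips = [random.random() < 0.5 for _ in range(500_000)]

# Collect the flip that immediately follows each run of five heads.
after_five_heads = [
    flips[i + 5]
    for i in range(len(flips) - 5)
    if all(flips[i:i + 5])
]

frac_heads = sum(after_five_heads) / len(after_five_heads)
print(f"P(heads after 5 heads) is roughly {frac_heads:.3f}")
```

The fraction hovers around 0.5: the coin has no memory, and a streak of heads does nothing to make tails “due.”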
Relatedly, there’s also the positive expectation bias — which often fuels gambling addictions. It’s the sense that our luck has to eventually change and that good fortune is on the way. It also contributes to the “hot hand” misconception. Similarly, it’s the same feeling we get when we start a new relationship that leads us to believe it will be better than the last one.
Remember that time you bought something totally unnecessary, faulty, or overly expensive, and then you rationalized the purchase to such an extent that you convinced yourself it was a great idea all along? Yeah, that’s post-purchase rationalization in action — a kind of built-in mechanism that makes us feel better after we make crappy decisions, especially at the cash register. Also known as Buyer’s Stockholm Syndrome, it’s a way of subconsciously justifying our purchases — especially expensive ones. Social psychologists say it stems from the principle of commitment, our psychological desire to stay consistent and avoid a state of cognitive dissonance.
Very few of us have a problem getting into a car and going for a drive, but many of us experience great trepidation about stepping inside an airplane and flying at 35,000 feet. Flying, quite obviously, is a wholly unnatural and seemingly hazardous activity. Yet virtually all of us know and acknowledge the fact that the probability of dying in an auto accident is significantly greater than getting killed in a plane crash — but our brains refuse to accept this crystal clear logic (statistically, we have a 1 in 84 chance of dying in a vehicular accident, as compared to a 1 in 5,000 chance of dying in a plane crash [other sources indicate odds as high as 1 in 20,000]). It’s the same phenomenon that makes us worry about getting killed in an act of terrorism as opposed to something far more probable, like falling down the stairs or accidental poisoning.
This is what the social psychologist Cass Sunstein calls probability neglect — our inability to grasp a proper sense of peril and risk — which often leads us to overstate the risks of relatively harmless activities, while underrating more dangerous ones.
- Meet the Sandy Hook truthers (illuminutti.com)
- Sicko Conspiracy Sociopaths Harass Man Who Sheltered Kids During Sandy Hook Massacre (theageofblasphemy.wordpress.com)
- The worst Sandy Hook conspiracy theory yet (salon.com)
- KTH: Newtown harassed by conspiracy theorists (ac360.blogs.cnn.com)
- Anderson Cooper Tackles ‘Sickening,’ ‘Ignorant’ Newtown Conspiracy Theories (mediaite.com)
- Professor James Tracy says he’s facing university probe over Newtown conspiracy (rawstory.com)
- Exposing Newtown conspiracy theory (illuminutti.com)
- Anderson Cooper Goes After ‘Anonymous Internet Trolls’ Pushing Conspiracies About Newtown And Gun Control (mediaite.com)
- Newtown families targeted, intimidated (ac360.blogs.cnn.com)
Originally posted on Anderson Cooper 360:
Anderson Cooper speaks with Salon.com reporter Alex Seitz-Wald, who has been covering the Sandy Hook conspiracy theories, and Jordan Ghawi, whose sister was killed in the Aurora, Colorado movie theater shooting. Ghawi was targeted by conspiracy theorists days after Jessica died.
Seitz-Wald researched conspiracy theories in the U.S. and found most followers are “inclined to believe the government is out to get them” and is collaborating with the media. “They have this confirmation bias, as psychologists call it, to look for only evidence that supports their theories and disregard anything that says otherwise,” he says.
via NeuroLogica Blog
In just about every disaster or event in which there are many deaths, such as a plane crash, there are likely to be, by random chance alone, individuals who survived due to an unlikely sequence of events. Passengers missing their flight by a few minutes can look back at all the small delays that added up to them seeing the doors close as they jog up to their gate. If that plane were then to crash, killing everyone on board, those small delays might seem like destiny. The passenger who canceled their flight because of flying anxiety might feel as if they had a premonition.
This is nothing but the lottery fallacy – judging the odds of an event occurring after the fact. What are the odds of one specific person winning the lottery? Hundreds of millions to one against. What are the odds of someone winning the lottery? Very good.
Likewise, what are the chances that someone will miss or choose not to take any particular flight? Very high – therefore this is likely to be true about any flight that happens to crash. If you are that one person, however, it may be difficult to shake the sense that your improbable survival was more than just a lucky coincidence.
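The two questions in the lottery example give wildly different answers, which a few lines of arithmetic make plain. The jackpot odds and ticket count below are made-up round figures for illustration, not the parameters of any real lottery:

```python
# Odds that one specific ticket wins vs. odds that *someone* wins,
# assuming hypothetical 1-in-300-million odds and 200 million tickets sold.
P_TICKET = 1 / 300_000_000
TICKETS_SOLD = 200_000_000

p_you_win = P_TICKET
# Probability that at least one of the independent tickets hits the jackpot.
p_someone_wins = 1 - (1 - P_TICKET) ** TICKETS_SOLD

print(f"P(you win)      = {p_you_win:.2e}")
print(f"P(someone wins) = {p_someone_wins:.2f}")
```

Your own ticket is a hundreds-of-millions-to-one long shot, yet the chance that somebody wins is close to a coin flip — and the same asymmetry holds for the "miraculous" crash survivor.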
A similar story has emerged from the Sandy Hook tragedy. Karen Dryer, the mother of a kindergartener there, claims that her 5-year-old son was saved by his psychic powers. She reports that her son, after a few months at the school, started to cry and be unhappy there. He was homeschooled for a short time, during which the shooting occurred. Now, at the new elementary school that recently opened, he seems to be happy.
In retrospect it may seem like a compelling story – if one does not think about it too deeply. As Ben Radford points out in the article linked to above, the story as told is likely the product of confirmation bias. The mother is now remembering details that enhance the theme of the story (her son’s alleged psychic powers) and forgetting details that might be inconsistent.
MORE . . .
via NeuroLogica Blog
It is … not enough to have a generally skeptical outlook, or even to call oneself a skeptic. Skepticism is a journey of self-knowledge, exploration, and mastering the various skills that comprise so-called metacognition – the ability to think about thinking.
As an example of the need for metacognitive skills in navigating this complex world there is confirmation bias. This is definitely on my top 5 list of core skeptical concepts, and is a major contributor to faulty thinking. Confirmation bias is the tendency to perceive and accept information that seems to confirm our existing beliefs, while ignoring, forgetting, or explaining away information that contradicts our existing beliefs. It is a systematic bias that works relentlessly and often subtly to push us in the direction of a desired or preexisting conclusion or bias. Worse – it gives us a false sense of confidence in that conclusion. We think we are following the evidence, when in fact we are leading the evidence.
Part of the illusion of evidence created by confirmation bias is the fact that there is so much information out there in the world. We encounter numerous events, people, and bits of data every day. Our brains are great at sifting this data for meaningful patterns, and when we see a pattern we think, “What are the odds? That cannot be a coincidence, and so it confirms my belief.” In reality, given the number of opportunities, it was almost certain that you would encounter something that could seem to confirm your belief.
Another factor that plays into confirmation bias is using open-ended criteria, or ad-hoc or post-hoc analysis. This means that we decide after we encounter a bit of information that this information confirms our belief. We retrofit the new data into our belief as confirmation.
Confirmation bias is further supported by a network of cognitive flaws – logical fallacies, heuristics, and other cognitive biases – that conspire together to reinforce our existing beliefs. In the end you have people who, based on the same underlying reality, arrive at confidently and firmly held conclusions that are directly opposing and mutually exclusive.
I encounter examples of confirmation bias every day. (My now favorite quote about this is from Jon Ronson, who said, “After I learned about confirmation bias I started seeing it everywhere.”) Of course, at first it is easy to see confirmation bias in others, and only later do we learn to detect it in ourselves, which forever remains challenging.
MORE . . .
Confirmation bias refers to a type of selective thinking whereby one tends to notice and look for what confirms one’s beliefs, and to ignore, not look for, or undervalue the relevance of what contradicts one’s beliefs. For example, if you believe that during a full moon there is an increase in admissions to the emergency room where you work, you will take notice of admissions during a full moon but be inattentive to the moon when admissions occur during other nights of the month. A tendency to do this over time unjustifiably strengthens your belief in a relationship between the full moon and emergency admissions, as well as in other lunar effects.
This tendency to give more attention and weight to data that support our beliefs than we do to contrary data is especially pernicious when our beliefs are little more than prejudices. If our beliefs are firmly established on solid evidence and valid confirmatory experiments, the tendency to give more attention and weight to data that fit with our beliefs should not lead us astray as a rule. Of course, if we become blinded to evidence truly refuting a favored hypothesis, we have crossed the line from reasonableness to closed-mindedness.
Numerous studies have demonstrated that people generally give an excessive amount of value to confirmatory information, that is, to positive or supportive data. The “most likely reason for the excessive influence of confirmatory information is that it is easier to deal with cognitively” (Thomas Gilovich, How We Know What Isn’t So: The Fallibility of Human Reason in Everyday Life). It is much easier to see how data support a position than it is to see how they might count against the position. Consider a typical ESP experiment or a seemingly clairvoyant dream: Successes are often unambiguous or data are easily massaged to count as successes, while negative instances require intellectual effort to even see them as negative or to consider them as significant. The tendency to give more attention and weight to the positive and the confirmatory has been shown to influence memory. When digging into our memories for data relevant to a position, we are more likely to recall data that confirms the position.
Researchers are sometimes guilty of confirmation bias by setting up experiments or framing their data in ways that will tend to confirm their hypotheses.
The illusion of understanding occurs frequently due to selection bias and confirmation bias. By selecting only data that support one’s position and ignoring relevant data that would falsify or compromise one’s position, one can produce a convincing but misleading argument. By seeking only examples that confirm one’s belief, and by ignoring examples that disconfirm it or reveal the insignificance of the data one has put forth, one can easily create the illusion of understanding. The illusion of understanding is particularly prominent in the field of economic forecasting.
Think about it. If stock analysts could really beat the market consistently, wouldn’t they be stinking rich? Do you really think they are a clan of benevolent elves whose only goal is to help people like you get rich from their technical advice? Their cousins appear in infomercials all the time, telling stories about unfathomable riches that await you if you invest in their program. That’s how they make their money: not by using their program, but by selling it to others!
Skeptic’s definition of the day …
Confirmation bias refers to a type of selective thinking whereby one tends to notice and to look for what confirms one’s beliefs, and to ignore, not look for, or undervalue the relevance of what contradicts one’s beliefs.
Read all about it: confirmation bias – The Skeptic’s Dictionary – Skepdic.com.
See also: Confirmation Bias