Tag Archives: Critical thinking

Kill ChemTrails With Vinegar!!!!!

Finally! A solution to ChemTrails and ChemClouds!!!! Ordinary vinegar!!! Vinegar dissolves ChemClouds and ChemTrails!!! Seeing is believing!!

Principles of Curiosity

Personally, I would give this video 3.5 out of 5 stars. It felt too lengthy (40 minutes) for the amount of information presented, but still very enjoyable.

Why do people join cults?

The first thought that came to mind was Scientology.

Critical thinking is one for the history books

A critical analysis of archeology leads to rejection of astrology, conspiracies, etc.

Via Ars Technica

The world as a whole has become increasingly reliant on science to provide its technology and inform its policy. But rampant conspiracy theories, fake news, and pseudoscience like homeopathy show that the world could use a bit more of the organized skepticism that provides the foundation of science. For that reason, it has often been suggested that an expanded science education program would help cut down on the acceptance of nonsense.

But a study done with undergrads at North Carolina State University suggests that a class on scientific research methods doesn’t do much good. Instead, a class dedicated to critical analysis of nonsense in archeology was far more effective at getting students to reject a variety of pseudoscience and conspiracy theories. And it worked even better when the students got their own debunking project.

The study, done by Anne Collins McLaughlin and Alicia McGill, lumps together things like belief in astrology, conspiracy theories, and ancient aliens, calling them “epistemically unwarranted.” Surveys show they’re widely popular; nearly half the US population thinks astrology is either somewhat or very scientific, and the number has gone up over time.

You might think that education, especially in the sciences, could help reverse this trend, but McLaughlin and McGill have some depressing news for you. Rejection of epistemically unwarranted ideas doesn’t correlate with scientific knowledge, and college students tend to have as much trouble coming to grips with reality as anyone else.

Continue Reading @ Ars Technica – – –

Here Be Dragons (Brian Dunning)

Here Be Dragons is a 40-minute video introduction to critical thinking. This video is on my “must watch” list for skeptics and critical thinkers 🙂

Most people fully accept paranormal and pseudoscientific claims without critique as they are promoted by the mass media. Here Be Dragons offers a toolbox for recognizing and understanding the dangers of pseudoscience, and appreciation for the reality-based benefits offered by real science.

Here Be Dragons is written and presented by Brian Dunning, host and producer of the Skeptoid podcast and author of the Skeptoid book series.

Source: Here Be Dragons – YouTube.

Critical Thinking

Fun stuff.

Critical Thinking – YouTube.

Can You Solve This?

I found this to be a great lesson in critical thinking. Check it out 🙂


Via Can You Solve This? – YouTube

How do you investigate hypotheses? Do you seek to confirm your theory – looking for white swans? Or do you try to find black swans? I was startled at how hard it was for people to investigate number sets that didn’t follow their hypotheses, even when their method wasn’t getting them anywhere.

This video was inspired by The Black Swan by Nassim Taleb and filmed by my mum. Thanks mum!
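The hidden-rule game from the video can be sketched in a few lines of code. The rules and test triples below are hypothetical stand-ins of my own, assuming a true rule of "any increasing numbers" and the narrower "each number doubles" hypothesis:

```python
# The seed 2, 4, 8 suggests the narrow hypothesis "each number doubles",
# but the true rule is simply "three increasing numbers".

def true_rule(a, b, c):
    return a < b < c

def doubling_hypothesis(a, b, c):
    return b == 2 * a and c == 2 * b

# Confirming tests: triples chosen to FIT the hypothesis ("white swans").
confirming = [(1, 2, 4), (3, 6, 12), (10, 20, 40)]
# Disconfirming tests: triples that VIOLATE the hypothesis ("black swans").
disconfirming = [(1, 2, 3), (5, 10, 15), (3, 2, 1)]

for triple in confirming:
    # Both rules say "yes" here, so these tests teach us nothing.
    assert true_rule(*triple) == doubling_hypothesis(*triple) == True

# Only tests designed to break the hypothesis reveal the difference:
results = [(t, true_rule(*t)) for t in disconfirming]
# (1, 2, 3) and (5, 10, 15) break the hypothesis yet pass the true rule.
```

Every confirming test comes back "yes" under both rules, which is exactly why seeking white swans feels productive while proving nothing.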

A Magical Journey through the Land of Reasoning Errors

Four common types of analytical errors in reasoning that we all need to beware of.

By Brian Dunning via Skeptoid
Read transcript below or listen here.

Today we’re going to cover a bit of new ground in the basics of critical thinking and critical reasoning. There are several defined types of common analytical errors to which we’re all prone; some, perhaps, more so than others. Reasoning errors can be made accidentally, and some can even be made deliberately as a way to influence the acceptance of ideas. We’re going to take a close look at the Type I false positive error, the Type II false negative error, the Type III error of answering the wrong question, and finally the dreaded Type IV error of asking the wrong question.

By way of example we’ll apply these errors to three hypothetical situations, all of which should be familiar to fans of scientific skepticism:

  1. From the realm of the paranormal, a house is reported to be haunted. The null hypothesis is that there is no ghost, until we find evidence that there is.
  2. The conspiracy theory that the government is building prison camps in which to orderly dispose of millions of law-abiding citizens. The null hypothesis is that there are no such camps, until we find evidence of them.
  3. And from alternative medicine, the claim that vitamins can cure cancer. The null hypothesis is that they don’t, unless it can be proven through controlled testing.

So let’s begin with:

Type I Error: False Positive

A false positive is believing in something that isn’t there, or more formally, the rejection of a true null hypothesis — it turns out there’s nothing there, but you conclude that there is. In cases where the null hypothesis does turn out to be true, a Type I error incorrectly rejects it in favor of a conclusion that the new claim is true. A Type I error occurs only when the conclusion that’s made is faulty, based on bad evidence, misinterpreted evidence, an error in analysis, or any number of other factors.

In the haunted house, Type I errors are those that occur when the house is not, in fact, haunted; but the investigators erroneously find that it is. They may record an unexplained sound and wrongly consider that to be proof of a ghost, or they may collect eyewitness anecdotes and wrongly consider them to be evidence, or they may have a strange feeling and wrongly reject all other possible causes for it.

The conspiracy theorist commits a Type I error when the government is not, in fact, building prison camps to exterminate citizens, but he comes across something that makes him reject that null hypothesis and conclude that it’s happening after all. Perhaps he sees unmarked cars parked outside a fenced lot that has no other apparent purpose, and wrongly considers that to be unambiguous proof, or perhaps he watches enough YouTube videos and decides that so many other conspiracy theorists can’t be all wrong. Perhaps he simply hates the government, so he automatically accepts any suggestion of their evildoing.

Finally, the alternative medicine hopeful commits a Type I error when he concludes that vitamins successfully treat a cancer that they actually don’t. Perhaps he hears enough anecdotes or testimonials, perhaps he is mistrustful of medical science and erroneously concludes that alternative medicine must therefore work, or whatever his thought process is; but an honest conclusion that the null hypothesis has been proven false is a classic Type I error.

Type II Error: False Negative

Cynics are those who are most often guilty of the Type II error, the acceptance of the null hypothesis when it turns out to actually be false — it turns out that something is there, but you conclude that there isn’t. If you actually do have psychic powers but I am satisfied that you do not, I commit a Type II error. The villagers of the boy who cried “Wolf!” commit a Type II error when they ignore his warning, thinking it false, and lose their sheep to the wolf. The protohuman who hears a rustling in the grass and assumes it’s just the wind commits a Type II error when the panther springs out and eats him.

Perhaps somewhere there is a house that actually is haunted, and maybe the TV ghost hunters find it. If I laugh at their silly program and dismiss the ghost, I commit a Type II error. If it were to transpire that the government actually is implementing plans to exterminate millions of citizens in prison camps, then everyone who has not been particularly concerned about this (myself included) has made a Type II error. The invalid dismissal of vitamin megadosing would also be a Type II error if it turned out to indeed cure cancer, or whatever the hypothesis was.

Type I and II errors are not limited to whether we believe in some pseudoscience; they’re even more applicable in daily life, in business decisions and research. If I have a bunch of Skeptoid T-shirts printed to sell at a conference, I make a Type I error by assuming that people are going to buy, and it turns out that nobody does. The salesman makes a Type II error when he decides that no customers are likely to buy today, so he goes home early, when in fact it turns out that one guy had his checkbook in hand.

Both Type I and II errors can be subtle and complex, but in practice, the Type I error can be thought of as excess idealism, accepting too many new ideas; and the Type II error as excess cynicism, rejecting too many new ideas.
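As an illustration of how these two error types trade off, here is a minimal simulation sketch; the coin-flip setup, sample size, and detection threshold are arbitrary assumptions of mine, not anything from the episode:

```python
# Minimal Monte Carlo sketch of Type I and Type II error rates.
import random

random.seed(1)

def experiment(true_prob, n=100, threshold=60):
    """Flip a coin n times; 'detect an effect' if heads reach the threshold."""
    heads = sum(random.random() < true_prob for _ in range(n))
    return heads >= threshold  # True = we reject the null ("something is there")

trials = 10_000
# Null hypothesis is true (fair coin): any rejection is a Type I error.
type_1 = sum(experiment(0.5) for _ in range(trials)) / trials
# Null hypothesis is false (biased coin): any non-rejection is a Type II error.
type_2 = sum(not experiment(0.65) for _ in range(trials)) / trials

print(f"Type I (false positive) rate: {type_1:.3f}")
print(f"Type II (false negative) rate: {type_2:.3f}")
```

Raising the detection threshold makes the imagined investigator more cynical: fewer false positives, but more real effects dismissed. Lowering it does the reverse, which mirrors the excess-idealism versus excess-cynicism framing above.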

Before talking about Type III and IV errors, it should be noted that these are not universally accepted. Types I and II have been standard for nearly a century, but various people have extended the series in various directions since then; so there is no real convention for what Types III and IV are. However, the definitions I’m going to give are probably the most common, and they work very well for the purpose of skeptical analysis.

MORE – – –


Via QualiaSoup – YouTube

A look at some of the flawed thinking that prompts people who believe in certain non-scientific concepts to advise others who don’t to be more open-minded.

Skeptic Presents: Get Your Guru Going

Via Skeptic Magazine

In this video — the fifth in our series of videos that promote science and critical thinking through the use of humor, wit, and satire — we present a Con Academy mini course in the techniques of New Age Spiritual Gurutry.

If you missed our first four videos, check them out:

Emergency Handbook: What to Do When a Friend Loves Woo

How you can help a friend or loved one with a potentially harmful pseudoscientific belief

By Brian Dunning via Skeptoid: Critical Analysis Podcast. Read podcast transcript below or listen here.

It’s the #1 most common question I get: My wife, my friend, my mom, my boss, is investing their health or their money in some magical or fraudulent product/scheme/belief. What can I do about it?

This is a tough situation to be in. Whether it’s a loved one who’s ill and is being taken advantage of by a charlatan selling a magical cure with no hope of treating the illness, or a friend who’s out of work and is going into deeper debt to buy into a hopeless multilevel marketing plan, it’s really hard to watch. The hardest is when they have a real problem and are expending their limited resources trying to solve it with a medieval, magic-based system that you know can’t possibly help. But all too often, they think it’s helping. Cognitive biases, anecdotal thinking, placebo effects and cognitive dissonance combine to build a powerful illusion that our brains are hardwired to believe in. At some point, it falls to a caring friend to try and rescue them with a candle of reason.

You’re up against a foe who’s far more formidable than you might think. This isn’t like settling a bet with a friend where you can look up the answer on Wikipedia, see who’s right, then buy each other a beer. You’re going after someone’s religion. You’re setting out to talk someone out of believing something that they know to be true, for a fact, from their personal experience. That right there makes your task nearly impossible, but it’s worse. Their belief has spiritual underpinnings that make it deeply moral and virtuous. Imagine if someone came to you and flashed a magazine article that said it’s best to turn your children out into the street and never talk to them again. It’s not only unconvincing, it’s laughable. Your effort to talk someone out of their belief in their sacred cow is likely to be just as laughable.

So what should you do, give up? You may be surprised to hear it from me, but I advise you to do just that, in many cases. Know which battles to fight. Weigh the risks. Consider the context of your friend’s belief: Is he in imminent danger of harming himself or others? Probably not; and if not, this may not be the time to take what might be your only shot. So I want to make this a rule: Before you decide what to do, consider the risks and the context. How terrible are the consequences of your friend’s belief? Think that through comprehensively. Make sure you have a good understanding of the risks to your friend if you do nothing, and the risks to your relationship if you attack their beliefs and (in all probability) fail to convince them. It may well be that this first strategy I’m going to present is the safest.

Strategy #1: Do Nothing

Doing nothing now doesn’t mean giving up. When you choose not to confront your friend’s current weird belief, there’s still an effective strategy for helping him out that you can follow. By accepting and tolerating your friend’s weird belief, you’re actually setting yourself up to be in a position of great influence the next time something weird comes down the line. Your friend likely knows that you’re a skeptical person, and eventually he’ll recognize that you’ve been putting up with his weird belief and saying nothing. In fact he may someday ask you, “Hey, you know I believe in this weird thing, how come Mr. Cynical Skeptic has never tried to talk me out of it?”

Ask “Is it important to you?”


“You’re important to me.”

Think what a powerful message that sends. It may sound corny, but it’s a statement that your friend will always remember. You’ve just communicated that your friendship is more important than your “evil debunking hobby”. You’ve made it clear, unequivocally, that you don’t want such differences to come between you.

And now look at the position you’re in. You’re trusted. You’re an ally at the most important and fundamental level. This is exactly where you need to be if you want to be influential on someone. You can now begin to introduce critical thinking using topics that are more about exploration than confrontation, and this is a journey you should take together. Next time you’re in the car together, play a few Skeptoid episodes. Play episodes like The Baigong Pipes, Is He Real or Is He Fictional, The Missing Cosmonauts, and When People Talk Backwards. Topics such as these do not attack or challenge anyone, they instill an appreciation and a passion for the value of critical thinking. Once introduced, I find that most people want more.

Gather every bit of skeptical material you can find that you know will interest your friend, and that does not attack or challenge his belief. So long as you remain a trustworthy friend and not an irrational adversary, you’re in a position to introduce him to the fundamentals of critical thinking, and to the value and tangible rewards of reality. Don’t underestimate the value of seeds that are well planted in a good environment. If your friend comes around on his own, his growth is far more complete than any that’s forced upon him.

Always remember the story of the little boy who couldn’t get his pet turtle to come out of his shell. He tried to pull on its head, he shook it, he squirted water, he did everything he could think of. But the turtle wouldn’t come out. Then his grandfather took the turtle and placed it on the warm hearth, and within a minute the turtle was out of his shell. The little boy never forgot that lesson.

Strategy #2: The Intervention

Sometimes the situation is urgent and you don’t have time to do things the easy way. There might be a medical crisis, an emotional crisis, or a financial crisis, and an immediate intervention is needed. Sometimes a friend’s situation is dire enough that helping him is worth the loss of the personal relationship. In these cases, and probably only in these cases, would I suggest a confrontational approach. And to do this effectively, draw on the established principles of the counseling intervention.

First you want to gather a group of friends or family, and you need to meet with them separately. Try to get a group, but even if there are only two of you, it’s worlds better than just you by yourself. Your next task is to present your evidence to the group that the magical system your friend is relying on is pseudoscientific and cannot help him. Do not expect them to accept what you say at face value, and do expect that some of them might buy into the magical system as well. Be prepared. Show your work. Print out pages from the web. Use the Science Based Medicine blog, use Skeptoid, use Quackwatch, use Swift. Search the best sources and have all your ducks in a row. The most important thing you need to do at this stage is to be certain that everyone in the group is united in their understanding of the useless, pseudoscientific nature of the magical sacred cow.

MORE – – –

On a related note . . .

Susan Blackmore – Fighting the Fakers (and Failing) – TAM 2013:

Susan Blackmore is a psychologist and writer researching consciousness, memes, and anomalous experiences, and a Visiting Professor at the University of Plymouth. She is the author of a number of books, including The Meme Machine and Zen and the Art of Consciousness.

“You Know You Are a Conspiracy Theorist If…”

A deceptive test to make people believe they are a conspiracy theorist.

Via The Soap Box

A few months ago I came across the You Know You Are a Conspiracy Theorist If… test (which I found to be laughable when I saw it) to help a person tell if they are a conspiracy theorist or not (view the test here).

I have some things to say about this “test” and some comments about “questions” that were asked (well, they’re not really questions) as well as a few questions of my own:

• You are capable of critical thinking.

This is a paradox. If a conspiracy theorist was capable of critical thinking, then they wouldn’t be a conspiracy theorist because people who are capable of critical thinking would figure out that a conspiracy theory was BS.

• You distrust mainstream media.

So do most skeptics, although for entirely different reasons than conspiracy theorists do.

• You like nature.

Lots of people do. What does this have to do with being a conspiracy theorist?

• You think it’s a good idea to spend the Friday after Thanksgiving with your family rather than camping outside Best Buy to get a cheap plasma television made in China.

That doesn’t make you a conspiracy theorist. That makes you someone who is smart enough not to waste their time in the cold waiting for some store to open in the hope of finding bargains.

• You think it’s a little strange that WTC building 7 came down at free fall speed on 9/11 yet it was never hit by a plane.

This might make you a conspiracy theorist, as well as someone who has conveniently forgotten that WTC7 was hit by something… a skyscraper.

• You think that drones in America might not be for Al Qaeda.

This might also make you a conspiracy theorist… or it might make you someone who knows drones that fly over America are also used for multiple benign purposes.

• You would like to be able to get on a plane without having to engage in a mandatory radiation bath and digital strip search.

As do many Americans, especially those who have gone through that process.

• You have read a book in the past year.

What does reading a book have to do with being a conspiracy theorist?

• You think you have the right to protest.

According to the First Amendment, I don’t just think I have the right, I have the right, period!

• You think the War on Terror is a scam.

That depends on what your definition of “scam” is.

• You think the War on Drugs is a scam.

Again, that depends on what your definition of “scam” is. Does your definition mean completely bogus and fraudulent, or wasteful and unnecessary?

• You think the anger directed at America from the Middle East could possibly be related to our foreign policy rather than hating how amazingly free we are.

This just means you’ve done more than five minutes worth of research about the Middle East.

• You think the Republicans and Democrats are exactly the same on the important issues affecting our country.

This could mean you’re a conspiracy theorist… it could also mean that you’re a Libertarian, or you’re just ticked off at both political parties.

• You think believing in The Constitution does not constitute a terrorist act.

Who the Hell believes that believing in the Constitution is a terrorist act? The only people who believe that are idiots!

• You have heard of the Bill of Rights and can even name what some of them are.

As most Americans have and can…

• You question whether the government loves you.

The government is not a living entity. It neither loves nor hates, therefore it is pointless to ask if it loves you or not.

• You think the right to bear arms is not for hunting, rather so citizens can fight back should the government become a bunch of tyrannical thugs.

Yeah, this could mean that you’re a conspiracy theorist… it could also mean that you just don’t like the government, or you’re afraid that the United States “could” become a tyrannical dictatorship.

• You don’t own a television, and if you do, all you watch is RT, especially the Keiser Report and Capital Account.

(Reading that alone makes me wonder if this is satire) If all you watch on television is RT (Russia Today) then there is no need to finish this test. You are a conspiracy theorist.

MORE – – –

Bad Thinking Makes Bad Things Happen

by Jamy Ian Swiss via Bad Thinking Makes Bad Things Happen

“The Secret teaches that victims are always to blame, and that anyone can have anything simply by wishing.” – Brian Dunning

For a moment there that headline might seem like preaching the converse of “The Secret”, the toxically ignorant book promoted by the toxically ignorant Oprah. But this isn’t about the notion that thinking bad – or good – thoughts produces bad or good results. That notion is just plain dumb. (It’s also hateful because it inescapably claims that bad things happen to people because they don’t think good thoughts.)

What I mean by “bad thinking” here however is poor thinking – the inability to think critically, the inability to understand or effectively utilize science and scientific reasoning. And when that kind of bad thinking is in effect, then in fact, very bad things do happen. Not to mention: to good people. And their children.

This was evidenced yet again a few weeks ago when a study published in the journal “Pediatrics” provided further evidence that the 2010 pertussis (whooping cough) outbreak in California was partly the result of increased numbers of parents opting out of vaccinating their children.

Sometimes too much education, too much disposable income, too much free time and above all, too much good medicine and good health, can lead otherwise seemingly intelligent people to make appallingly ignorant and hazardous choices. That appears to be the case evidenced by the new study. According to a story at salon.com (quoting a report on NPR):

“… a community loses herd immunity after the vaccination rate drops below 95 percent. In 2010, only 91 percent of California kindergarteners were up to date on their shots. The researchers found that in some neighborhoods, especially those with high income and education levels, exemption rates were as high as 75 percent.”

The significant point to understand about herd immunity is that the greater percentage of vaccinated community members in turn helps protect infants, who are too young to be vaccinated, and anyone else unable to safely be given the vaccine, from contracting the disease.
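As a rough sketch of why the quoted 95 percent figure is plausible, the textbook herd-immunity threshold is 1 − 1/R0, where R0 is the number of people one infected person typically infects in a fully susceptible population. The R0 values below are illustrative estimates of mine, not figures from the article:

```python
# Sketch of the standard simple herd-immunity threshold formula, 1 - 1/R0.
# R0 values here are rough illustrative estimates, not data from the study.
def herd_immunity_threshold(r0):
    """Fraction of the community that must be immune to halt sustained spread."""
    return 1 - 1 / r0

for disease, r0 in [("measles", 15), ("pertussis", 13), ("influenza", 2)]:
    pct = herd_immunity_threshold(r0)
    print(f"{disease}: R0 = {r0} -> roughly {pct:.0%} must be immune")
```

For a highly contagious disease like pertussis, the formula lands in the low-to-mid 90s, which is why a drop from 95 to 91 percent coverage matters so much.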

Guess which child was vaccinated.

A piece in “Scientific American” points out that, “Unvaccinated individuals in the 2010 epidemic were eight times more likely to contract pertussis than vaccinated ones. But unvaccinated individuals pose risks to the community as well. ‘It’s a choice you make for yourself and a choice you make for those around you,’ Offit [Paul Offit, director of the Vaccine Education Center at The Children’s Hospital of Philadelphia] says. ‘Infants need those around them to be protected in order not to get sick. We have a moral and ethical responsibility to our neighbors as well as to ourselves and our children.’”

So bad thinking does make bad things happen – and in this case, not just to the people doing the bad thinking, but to other people, and to other people’s children – and since I live in San Diego, my children are at risk thanks to that bad thinking. If you don’t think that science education and critical thinking skills are important, think again. If you don’t think the skeptic movement does important work, think again. If you don’t think that educating people about how to think about psychics and Bigfoot claims has a direct connection to the unnecessary medical risk my children face thanks to bad thinking – think again.

MORE . . .

Asking the Socratic Questions

A line of reasoning named for Socrates helps us help believers in the strange re-examine their beliefs.

By Brian Dunning via Asking the Socratic Questions

Read transcript below or listen here

Of all the possible perspectives, beliefs, theories, ideologies, and conclusions in this world, which of them are beyond question? None of them. And neither should be any person who holds one of those positions. People believe all sorts of strange things, and even though they might be passionate about them, most will still admit that questioning their belief is an appropriate undertaking. Therefore, we — as scientific skeptics — have an available avenue by which we can always encourage believers in the strange to revisit their beliefs. Despite the fact that we may lack professional expertise in the subject at hand, we can still plant the seeds of an uprising of logic within the mind of the believer. One way to do this is through the application of Socratic questioning.

Returning to Starling and Bombo, our fictional characters from past episodes, we can illustrate this concept. Let us choose an example scenario. If Bombo has seen a UFO and believes that it was an alien spacecraft, it would likely be difficult for Starling to reason him out of the idea by offering alternative suggestions. People are often pretty stubborn when it comes to personal experiences that they’ve already interpreted for themselves; Bombo saw an alien spacecraft, and telling him it was the planet Venus would probably be a dead end. Indeed, even offering lines of logic for Bombo to follow on his own would probably be refused. So is there any effective way at all of getting someone to consider a different explanation?

The answer is yes, and it involves getting Bombo to arrive at alternate explanations on his own. We’re all far more prone to accept our own ideas than someone else’s. Starling might well be able to get Bombo to consider the idea that the UFO might not have been an alien spacecraft by employing Socratic questioning. Named (quite obviously) for Socrates — the ancient Greek philosopher (also quite obviously) — the Socratic questions are primarily teaching tools. Just as Bombo better accepts his own ideas, so do students of all types. Socratic questioning helps people to take a second, closer look at their own beliefs, and to apply critical thinking even when they least expect it.

There are six commonly described categories of Socratic questions, and they’re all good. You could familiarize yourself with any one of them, and you’d have a pretty good chance at changing Bombo’s mind, or that of anyone else who has made a conclusion based on faulty logic. An adept at all six types of questions would be a formidable reformer of popular pseudoscience believers.

Let’s begin with the first type:

MORE . . .

Skeptic Presents: You Can’t Handle The Truther

via ▶ Skeptic Presents: You Can’t Handle The Truther – YouTube.

In this video—the third in our series of videos that promote science and critical thinking through the use of humor, wit, and satire—CIA Agents plot the 9/11 attack on the Twin Towers and Pentagon.

If you missed our first two videos, check them out:

The Con Academy: http://youtu.be/eR_HlRDhUxY
B.Y.T.H Busters: The Secret Law of Attraction: http://youtu.be/Nf3BlmsTA8Q

The Con Academy (Vol.1)

This is volume 1 of The Con Academy videos—another resource in the Skeptics Society’s arsenal of Skepticism 101 for teaching critical thinking and promoting science through the use of humor, wit, and satire. In this faux commercial for The Con Academy you’ll see how psychics count on the confirmation bias to convince people that their powers are real when, in fact, they are just remembering the hits and forgetting the misses. We also demonstrate how psychic “organizations” con people by taking their money for services that are not real.

MORE: Skeptic Presents: The Con Academy (Vol.1) – YouTube.


The Honest Liar – Homeopathy

Money for Nothing

by JREF Staff via randi.org

JREF senior fellow, magician and scientific skeptic Jamy Ian Swiss, “The Honest Liar”, presents JREF’s newest video series, aptly titled The Honest Liar. Follow Jamy as he uses critical thinking, skepticism, and a healthy dose of humor, along with his expertise in legerdemain, to explore the facts behind false claims.

In our first episode, “Money for Nothing”, Jamy punctures the pretense of homeopathy. How much is too much to pay for a remedy with nothing in it?

View on YouTube

The Periodic Table of Irrational Nonsense

This is some pretty funny stuff. Are you familiar with a periodic table? Well, this is the periodic table of irrational nonsense courtesy of Crispian Jago’s blog Science, Reason and Critical Thinking.

How does it work? Simply click on the image to be taken to the interactive page. At the interactive page you simply move your mouse over an element to view a short description.


Enjoy!     🙂


Click on the image to be taken to the interactive page.

Bible Code

by Crispian Jago via Science, Reason and Critical Thinking

Many conspiracy theorists seem very keen on the idea of hidden messages or codes secretly embedded within ancient writings. Believers claim hidden prophecies of significant world events and disasters can be uncovered and deciphered by analysing the Bible. By simply selecting a random passage, stripping out the punctuation, and arranging the letters into a matrix, a suitably motivated skeptic, with the benefit of hindsight, is easily able to uncover whatever it is they fancy. Believers see predictions of the assassination of President Kennedy and the 9/11 twin towers terrorist attack uncovered in the Bible as irrefutable evidence of divine revelation, even though rational thinkers can locate predictions of the death of Leon Trotsky and Princess Diana secreted within “Moby Dick”.
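The matrix trick described above is known as an equidistant letter sequence (ELS) search, and it is easy to sketch in code; the function name, sample text (the opening line of “Moby Dick”), and target word are my own illustrative choices:

```python
# Toy equidistant-letter-sequence (ELS) search of the kind "Bible code"
# proponents use: strip punctuation, then read letters at a fixed skip.
def find_els(text, word, max_skip=50):
    """Return (start, skip) pairs where `word` appears at a fixed letter skip."""
    letters = "".join(ch.lower() for ch in text if ch.isalpha())
    hits = []
    for skip in range(1, max_skip + 1):
        for start in range(len(letters)):
            if letters[start::skip][:len(word)] == word:
                hits.append((start, skip))
    return hits

sample = "Call me Ishmael. Some years ago - never mind how long precisely..."
print(find_els(sample, "ago"))
```

With enough text, enough candidate words, and fifty skips to choose from, "hits" are essentially guaranteed, which is exactly why hindsight searches of any long book turn up "prophecies".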

Click image for larger view

via Science, Reason and Critical Thinking: Bible Code.

Spontaneous Human Stupidity

Via NeuroLogica Blog

Spontaneous Human Combustion (SHC) is one of those classic pseudosciences that have been around for a long time – like astrology, Bigfoot, and the Bermuda Triangle. I put it in the same category as the myth that we only use about 10% of our brain capacity; it’s widely believed, but no one really cares that much. It’s just something people hear about and have no reason to doubt, so they lazily accept it. I did when I was younger (in my pre-skeptical days): you hear about it on TV and think, “Huh, isn’t that interesting.”

It’s therefore a good opportunity to teach critical thinking skills. People’s brains are clogged with myths and false information, spread by rumor and the media, and accepted due to a lack of having the proper critical thinking filters in place. It’s disappointing, however, when people who should know better, or whose job it is to know better, fall for such myths.

Recently an Irish coroner concluded that a man died from SHC, and it is reported:

The West Galway coroner, Ciaran McLoughlin, said there was no other adequate explanation for the death of Michael Faherty, 76, also known as Micheal O Fatharta.


The coroner said: “This fire was thoroughly investigated and I’m left with the conclusion that this fits into the category of spontaneous human combustion, for which there is no adequate explanation.”

First, let’s play a game of name-that-logical-fallacy. The core fallacy the coroner is committing is the argument from ignorance. The investigation could not find a cause for the fire, therefore here is the specific cause – SHC. The conclusion should rather be – we don’t know what caused the fire.

The coroner said the case “fits into the category” of SHC – but how?

Keep Reading: NeuroLogica Blog » Spontaneous Human Stupidity.
