Tag Archives: Critical thinking

Kill ChemTrails With Vinegar!!!!!

Finally! A solution to ChemTrails and ChemClouds!!!! Ordinary vinegar!!! Vinegar dissolves ChemClouds and ChemTrails!!! Seeing is believing!!

Principles of Curiosity

Personally, I would give this video 3.5 out of 5 stars. It felt too lengthy (40 minutes) for the amount of information presented, but still very enjoyable.

Why do people join cults?

The first thought that came to mind was Scientology.

Critical thinking is one for the history books

A critical analysis of pseudoscience in archeology leads students to reject astrology, conspiracy theories, etc.

Via Ars Technica

The world as a whole has become increasingly reliant on science to provide its technology and inform its policy. But rampant conspiracy theories, fake news, and pseudoscience like homeopathy show that the world could use a bit more of the organized skepticism that provides the foundation of science. For that reason, it has often been suggested that an expanded science education program would help cut down on the acceptance of nonsense.

But a study done with undergrads at North Carolina State University suggests that a class on scientific research methods doesn’t do much good. Instead, a class dedicated to critical analysis of nonsense in archeology was far more effective at getting students to reject a variety of pseudoscience and conspiracy theories. And it worked even better when the students got their own debunking project.

The study, done by Anne Collins McLaughlin and Alicia McGill, lumps together things like belief in astrology, conspiracy theories, and ancient aliens, calling them “epistemically unwarranted.” Surveys show they’re widely popular; nearly half the US population thinks astrology is either somewhat or very scientific, and the number has gone up over time.

You might think that education, especially in the sciences, could help reverse this trend, but McLaughlin and McGill have some depressing news for you. Rejection of epistemically unwarranted ideas doesn’t correlate with scientific knowledge, and college students tend to have as much trouble coming to grips with reality as anyone else.

Continue Reading @ Ars Technica – – –

Here Be Dragons (Brian Dunning)

Here Be Dragons is a 40-minute video introduction to critical thinking. This video is on my “must watch” list for skeptics and critical thinkers 🙂

Most people fully accept paranormal and pseudoscientific claims without critique, as they are promoted by the mass media. Here Be Dragons offers a toolbox for recognizing and understanding the dangers of pseudoscience, and an appreciation of the reality-based benefits offered by real science.

Here Be Dragons is written and presented by Brian Dunning, host and producer of the Skeptoid podcast and author of the Skeptoid book series.

Source: Here Be Dragons – YouTube.

Critical Thinking

Fun stuff.

Critical Thinking – YouTube.

Can You Solve This?

I found this to be a great lesson in critical thinking. Check it out 🙂


Via Can You Solve This? – YouTube

How do you investigate hypotheses? Do you seek to confirm your theory – looking for white swans? Or do you try to find black swans? I was startled at how hard it was for people to investigate number sets that didn’t follow their hypotheses, even when their method wasn’t getting them anywhere.

This video was inspired by The Black Swan by Nassim Taleb and filmed by my mum. Thanks mum!
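The video’s number game can be sketched in a few lines of code. This is a hypothetical reconstruction, assuming (as in the video) that the hidden rule is simply “any increasing sequence,” while the guessed rule is the narrower “each number doubles.” Confirming tests (white swans) fit both rules and so reveal nothing; only a disconfirming test (a black swan) exposes the difference:

```python
def hidden_rule(a, b, c):
    # The experimenter's actual rule: any strictly increasing triple.
    return a < b < c

def guessed_rule(a, b, c):
    # The subject's hypothesis: each number doubles the previous one.
    return b == 2 * a and c == 2 * b

# White swans: triples that fit BOTH rules, so a "yes" teaches us nothing.
confirming = [(2, 4, 8), (3, 6, 12)]

# Black swans: triples that break the guessed rule. If these are still
# accepted, the guess must be too narrow.
disconfirming = [(1, 2, 3), (5, 10, 11)]

for triple in confirming:
    print(triple, hidden_rule(*triple), guessed_rule(*triple))   # True, True
for triple in disconfirming:
    print(triple, hidden_rule(*triple), guessed_rule(*triple))   # True, False
```

The disconfirming triples are the informative ones: they are accepted by the hidden rule yet violate the hypothesis, which is exactly what subjects in the video rarely thought to try.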

A Magical Journey through the Land of Reasoning Errors

Four common types of analytical errors in reasoning that we all need to beware of.

By Brian Dunning via Skeptoid
Read transcript below or Listen here.

Today we’re going to cover a bit of new ground in the basics of critical thinking and critical reasoning. There are several defined types of common analytical errors to which we’re all prone; some, perhaps, more so than others. Reasoning errors can be made accidentally, and some can even be made deliberately as a way to influence the acceptance of ideas. We’re going to take a close look at the Type I false positive error, the Type II false negative error, the Type III error of answering the wrong question, and finally the dreaded Type IV error of asking the wrong question.

By way of example we’ll apply these errors to three hypothetical situations, all of which should be familiar to fans of scientific skepticism:

  1. From the realm of the paranormal, a house is reported to be haunted. The null hypothesis is that there is no ghost, until we find evidence that there is.
  2. The conspiracy theory that the government is building prison camps in which to dispose of millions of law-abiding citizens. The null hypothesis is that there are no such camps, until we find evidence of them.
  3. And from alternative medicine, the claim that vitamins can cure cancer. The null hypothesis is that they don’t, unless it can be proven through controlled testing.

So let’s begin with:

Type I Error: False Positive

A false positive is failing to believe the truth, or more formally, the rejection of a true null hypothesis — it turns out there’s nothing there, but you conclude that there is. In cases where the null hypothesis does turn out to be true, a Type I error incorrectly rejects it in favor of a conclusion that the new claim is true. A Type I error occurs only when the conclusion that’s made is faulty, based on bad evidence, misinterpreted evidence, an error in analysis, or any number of other factors.

In the haunted house, Type I errors are those that occur when the house is not, in fact, haunted; but the investigators erroneously find that it is. They may record an unexplained sound and wrongly consider that to be proof of a ghost, or they may collect eyewitness anecdotes and wrongly consider them to be evidence, or they may have a strange feeling and wrongly reject all other possible causes for it.

The conspiracy theorist commits a Type I error when the government is not, in fact, building prison camps to exterminate citizens, but he comes across something that makes him reject that null hypothesis and conclude that it’s happening after all. Perhaps he sees unmarked cars parked outside a fenced lot that has no other apparent purpose, and wrongly considers that to be unambiguous proof, or perhaps he watches enough YouTube videos and decides that so many other conspiracy theorists can’t be all wrong. Perhaps he simply hates the government, so he automatically accepts any suggestion of their evildoing.

Finally, the alternative medicine hopeful commits a Type I error when he concludes that vitamins successfully treat a cancer that they actually don’t. Perhaps he hears enough anecdotes or testimonials, perhaps he is mistrustful of medical science and erroneously concludes that alternative medicine must therefore work, or whatever his thought process is; but an honest conclusion that the null hypothesis has been proven false is a classic Type I error.

Type II Error: False Negative

Cynics are those who are most often guilty of the Type II error, the acceptance of the null hypothesis when it turns out to actually be false — it turns out that something is there, but you conclude that there isn’t. If you actually do have psychic powers but I am satisfied that you do not, I commit a Type II error. The villagers of the boy who cried “Wolf!” commit a Type II error when they ignore his warning, thinking it false, and lose their sheep to the wolf. The protohuman who hears a rustling in the grass and assumes it’s just the wind commits a Type II error when the panther springs out and eats him.

Perhaps somewhere there is a house that actually is haunted, and maybe the TV ghost hunters find it. If I laugh at their silly program and dismiss the ghost, I commit a Type II error. If it were to transpire that the government actually is implementing plans to exterminate millions of citizens in prison camps, then everyone who has not been particularly concerned about this (myself included) has made a Type II error. The invalid dismissal of vitamin megadosing would also be a Type II error if it turned out to indeed cure cancer, or whatever the hypothesis was.

Type I and II errors are not limited to whether we believe in some pseudoscience; they’re even more applicable in daily life, in business decisions and research. If I have a bunch of Skeptoid T-shirts printed to sell at a conference, I make a Type I error by assuming that people are going to buy, and it turns out that nobody does. The salesman makes a Type II error when he decides that no customers are likely to buy today, so he goes home early, when in fact it turns out that one guy had his checkbook in hand.

Both Type I and II errors can be subtle and complex, but in practice, the Type I error can be thought of as excess idealism, accepting too many new ideas; and the Type II error as excess cynicism, rejecting too many new ideas.
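The two error types can also be made concrete with a small simulation. This is my own illustrative sketch, not part of the transcript: it tests a coin for bias using an arbitrary decision rule (“accept the claim of bias if 100 flips produce 60 or more heads”) and measures how often each error occurs:

```python
import random

random.seed(1)

def acceptance_rate(p_heads, n_trials=10_000, flips=100, threshold=60):
    """Fraction of trials in which the 'biased coin' claim is accepted."""
    accepted = 0
    for _ in range(n_trials):
        heads = sum(random.random() < p_heads for _ in range(flips))
        if heads >= threshold:
            accepted += 1
    return accepted / n_trials

# Null hypothesis true (fair coin): every acceptance is a Type I error,
# a false positive.
type_i_rate = acceptance_rate(p_heads=0.5)

# Null hypothesis false (coin lands heads 65% of the time): every
# rejection is a Type II error, a false negative.
type_ii_rate = 1 - acceptance_rate(p_heads=0.65)

print(f"Type I (false positive) rate:  {type_i_rate:.3f}")
print(f"Type II (false negative) rate: {type_ii_rate:.3f}")
```

Tightening the threshold drives the Type I rate down but the Type II rate up, and vice versa — the same trade-off between excess idealism and excess cynicism described above.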

Before talking about Type III and IV errors, it should be noted that these are not universally accepted. Types I and II have been standard for nearly a century, but various people have extended the series in various directions since then; so there is no real convention for what Types III and IV are. However the definitions I’m going to give are probably the most common, and they work very well for the purpose of skeptical analysis.

MORE – – –


Via QualiaSoup – YouTube

A look at some of the flawed thinking that prompts people who believe in certain non-scientific concepts to advise others who don’t to be more open-minded.

Skeptic Presents: Get Your Guru Going

Via Skeptic Magazine

In this video — the fifth in our series of videos that promote science and critical thinking through the use of humor, wit, and satire — we present a Con Academy mini course in the techniques of New Age Spiritual Gurutry.

If you missed our first four videos, check them out:
