Tag Archives: Fallacy

The 10 Commandments of Rational Debate (…Know Thy Logical Fallacies)

via Relatively Interesting

Looking for an edge so you can win your next big argument?

Learn the 10 Commandments of Rational Debate and use them against your enemy as you obliterate their argument point by point (rationally, of course).  Knowing your logical fallacies and how the brain can deceive even the brightest of minds is the first step towards winning an argument.

These are 10 of the more popular logical fallacies, but there are many others you need to learn in order to master the art of debate…


1. Thou shalt not attack the person’s character, but the argument itself. (“Ad hominem”)

Example:  Dave listens to Marilyn Manson, therefore his arguments against certain parts of religion are worthless. After all, would you trust someone who listens to that devil worshiper?

2. Thou shalt not misrepresent or exaggerate a person’s argument in order to make it easier to attack. (“Straw Man Fallacy”)

Example:  After Jimmy said that we should put more money into health and education, Steve responded by saying that he was surprised that Jimmy hates our country so much that he wants to leave it defenceless by cutting military spending.

3. Thou shalt not use a small sample to represent the whole. (“Hasty Generalization”)

Example:  Climate Change Deniers take a small sample set of data to demonstrate that the Earth is cooling, not warming. They do this by zooming in on 10 years of data, ignoring the trend that is present in the entire data set which spans a century.

4. Thou shalt not argue thy position by assuming the conclusion in one of its premises. (“Begging the Question”)


Sheldon: “God must exist.”
Wilbert: “How do you know?”
Sheldon: “Because the Bible says so.”
Wilbert: “Why should I believe the Bible?”
Sheldon: “Because the Bible was written by God.”
Wilbert: “WTF?”

Here, Sheldon’s premise – that the Bible is trustworthy because God wrote it – already assumes his conclusion: that God exists.

5. Thou shalt not claim that because something occurred before something else, it must be the cause. (“Post Hoc/False Cause”)

This can also be read as “correlation does not imply causation”.

Example:  There were 3 murders in Dallas this week and on each day, it was raining. Therefore, murders occur on rainy days.

MORE . . .

How to improve your odds at the casino by understanding the Gambler’s Fallacy

Via RelativelyInteresting.com

Imagine you are at a Las Vegas casino and you’re approaching the roulette table.  You notice that the last eight numbers were black… so you think to yourself, “Holy smokes, what are the odds of that!” and you bet on red, thinking that the odds of another black number coming up are really small.  In fact, you might think that the odds of another black coming up are:

0.5 × 0.5 × 0.5 × 0.5 × 0.5 × 0.5 × 0.5 × 0.5 × 0.5 = 0.5⁹ ≈ 0.00195 (nine blacks in a row – a very tiny number)

Or are they?

The problem is that a roulette wheel – if fairly constructed – has no “memory”.  That is, one spin’s outcome does not depend on the previous spin’s result, so the odds of a red or black number are just about equal on every spin (actually, just shy of 50% each, since there are green spaces on the wheel – one on the European version, two on the American).

Keeping with our example, betting on either red or black for each spin is an outside bet that pays 1 to 1 and covers 18 of the 38 pockets on an American wheel (a probability of 18/38, or about 0.474).  That’s a far cry from the 0.00195 figure above – a miscalculation roughly 243 times too small.  Suddenly the odds of a red coming up don’t look so special anymore…
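These numbers are easy to verify. A quick Python sketch (mine, not the article’s) compares the gambler’s naive streak calculation against the true per-spin odds:

```python
# Naive gambler's reasoning: the probability of nine blacks in a row,
# treating each spin as 50/50 and ignoring the green pockets.
naive = 0.5 ** 9

# Actual probability that any single spin comes up black on an
# American wheel: 18 black pockets out of 38 (18 red, 18 black, 2 green).
per_spin = 18 / 38

print(round(naive, 5))     # 0.00195
print(round(per_spin, 3))  # 0.474
print(per_spin / naive)    # roughly 243 times larger
```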

This fallacy is called the Gambler’s Fallacy, and it’s what the city of Las Vegas is built on.

Random events produce clusters like “8 black numbers in a row”, but in the long run the frequencies of red and black even out to their natural averages.

The key to your success at the casino?  Understand that every individual spin (or “event”) has its own probability which never changes.  In this case, 18 in 38.
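A short simulation drives the point home. In this sketch (my own illustration, with the streak shortened to three blacks so that streaks occur often enough to count), the colour that follows a streak shows the same 18-in-38 odds as any other spin:

```python
import random

random.seed(42)

def spin():
    """One spin of an American wheel: pockets 0 and 00 are green,
    the other 36 split evenly between red and black."""
    pocket = random.randrange(38)
    if pocket < 2:
        return "green"
    return "red" if pocket % 2 == 0 else "black"

spins = [spin() for _ in range(1_000_000)]

# Collect the outcome immediately following every run of three blacks.
after_streak = [spins[i] for i in range(3, len(spins))
                if spins[i - 3:i] == ["black"] * 3]

prop_black = after_streak.count("black") / len(after_streak)
print(round(prop_black, 3))  # hovers around 18/38 ≈ 0.474 – the streak changes nothing
```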

So the next time you’re at a casino and you see a string of the same color coming up, remember that the odds of that color coming up again are exactly the same as the other color… it might save you a few bucks so you can play a bit longer.

Source:  http://en.wikipedia.org/wiki/Gambler%27s_fallacy

[END] RelativelyInteresting.com


The Curious Case of Correlation ≠ Causation

By marwanayache via Sensitive Context Blog

In my last post, I wrote about how not having enough contextual data can outright boggle the mind. Today, we’re going to read about something else that similarly boggles the mind, albeit not really related to any linguistic phenomena. It’s an interesting little logical fallacy in the field of statistics known as cum hoc ergo propter hoc, or more commonly, ‘correlation does not prove causation’.  Here, we define correlation as ‘when two things happen at the same time’, and causation as ‘when one thing causes the other’.

This logical fallacy is great at showing the glaring inaccuracies caused by lack of data on a specific subject, and how this lack can cause us to reach blindly for (often incorrect) conclusions in the proverbial fogginess of our mind. Additionally, the comedic factor here is amplified if you forego the law of parsimony (also known as Occam’s razor), which states that of all the possible solutions to a question or problem, the simplest one is most likely the truth.

Have you ever been in a recording booth or a really quiet place? If you’re in there for a long time your mind begins to create its own sounds. Essentially, you begin to hallucinate due to a lack of external stimuli. This is basically what goes on in the aforementioned logical fallacy: you end up compensating for a lack of data by drawing a perceived (and often inaccurate) connection between the sole items of data you have.

What does this have to do with a language blog? Essentially, it’s a great way of showing how a lack of the background information required for comprehension can yield wildly inaccurate knowledge. Dig this:

Did you know that children with bigger feet are statistically better at spelling?  This is statistically true. Without additional contextual information, I could hypothesize that having larger foot-size means the children would perform better at sports and have better balance while carrying large and cumbersome schoolbags, making them less prone to falling over in bustling school hallways, making them less likely targets for bullies, leading to an inevitable increase in confidence, leading to better scholastic performance, and thereby, better spelling skills!

The truth is, it’s actually because children with larger feet are probably a lot older than children with smaller feet. Duh.
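The foot-size example is easy to reproduce with simulated data. In this sketch (my own, with made-up numbers), age drives both foot size and spelling ability; the two correlate strongly overall, but the correlation vanishes once age is held fixed:

```python
import random
from statistics import mean

random.seed(0)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

ages, feet, spelling = [], [], []
for _ in range(5000):
    age = random.randint(6, 12)                        # the hidden confounder
    ages.append(age)
    feet.append(15 + 1.2 * age + random.gauss(0, 1))   # foot size grows with age
    spelling.append(10 * age + random.gauss(0, 10))    # spelling improves with age

r_all = pearson(feet, spelling)
print(round(r_all, 2))  # strong positive correlation overall

# Hold age fixed: among eight-year-olds only, the correlation disappears.
eights = [i for i, a in enumerate(ages) if a == 8]
r_within = pearson([feet[i] for i in eights], [spelling[i] for i in eights])
print(round(r_within, 2))  # near zero
```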

Did you know that you are more likely to get cancer if you always wear a seat-belt?

MORE . . .

Pragmatic Fallacy

Via The Skeptic’s Dictionary – Skepdic.com

The pragmatic fallacy is committed when one argues that something is true because it works and where ‘works’ means something like “I’m satisfied with it,” “I feel better,” “I find it beneficial, meaningful, or significant,” or “It explains things for me.” For example, many people claim that astrology works, acupuncture works, chiropractic works, homeopathy works, numerology works, palmistry works, therapeutic touch works. What ‘works’ means here is vague and ambiguous. At the least, it means that one perceives some practical benefit in believing that it is true, despite the fact that the utility of a belief is independent of its truth-value.

The pragmatic fallacy is common in “alternative” health claims and is often based on post hoc reasoning. For example, one has a sore back, wears the new magnetic or takionic belt, finds relief soon afterwards, and declares that the magic belt caused the pain to go away. How does one know this? Because it works! There is also some equivocation going on in the alternative health claims that fall under the heading of “energy medicine,” such as acupuncture and therapeutic touch. The evidence pointed to often uses ‘works’ in the sense of ‘the customer is satisfied’ or ‘the patient improves,’ but the conclusion drawn is that ‘chi was unblocked’ or ‘energy was transferred.’

There is a common retort to the skeptic who points out that customer satisfaction is irrelevant to whether the device, medicine, or therapy in question really is a significant causal factor in some outcome. Who cares why it works as long as it works? You can argue about the theory as to why it works, but you can’t argue about the customer satisfaction or the fact that measurable improvements can be made. That’s all that matters.

It isn’t all that matters. Testimonials are not a substitute for scientific studies, which are done to make sure that we are not deceiving ourselves about what appears to be true. It is especially necessary to . . .

. . . MORE . . .

Why calling someone a shill betrays the weakness of your position, and your inability to defend it.

Via Bad Skeptic

photo by aisletwentytwo on flickr. (CC).

Common among luddites, conspiracy theorists, and anti-science culture is the quick-draw cry of “shill!” – that is, the accusation that someone is taking a cash payment to espouse views that are not their own. Any statement made by the crier o’ shill’s opponent must be invalid, because the opponent is in the pay of imaginary demons (big whatever, Monsanto, the Government, reptile aliens) who are trying to pull the wool over everyone’s eyes. How is this proven? By the fact that they oppose the crier o’ shill, of course.

I wonder sometimes if criers o’ shill even know why they’re reaching for this tautology as a conceptual shield. It may seem like a strong tactic to the person using it, because it feels useful to knock down any argument against them. But it’s merely a paradox–an unfalsifiable loop.

  1. Now that the crier o’ shill has invalidated anything the ‘shill’ says, that person can no longer defend themselves, because they could be lying about not being a shill. Why would they lie? Because they’re a shill.
  2. The “evidence” for the person being a shill is the fact that they have a stance against the crier o’ shill. Calling someone a shill with no proof to back it up is the logical equivalent of saying, “You are wrong, because I am never wrong.”

Not only is the shill argument empty and sad, it’s one of the most common mistakes made in any argument. It’s a logical fallacy known as an ad hominem (“against the person”). The crier o’ shill attacks the person making a statement in an attempt to render that person’s argument invalid, rather than demonstrating any falsehood in the argument itself.

Of course you would say there’s no scientific proof that GMOs are dangerous! You’re getting paid by Monsanto!

Not only is the crier o’ shill . . .

. . . MORE . . .

Red Flags Of Quackery

Created by Maki at Sci-ence, the Red Flags Of Quackery infographic below lays out many of the gambits and logical fallacies you may encounter from charlatans and true believers.

(click image for larger view)

Logical Fallacies

Ever hear someone argue a point that was effective, even though it didn’t quite ring true? Chances are they used a logical fallacy.

Each video is only about 3 minutes long. Enjoy🙂

Part 1:

Part 2:

Part 3:

Three great websites run by Brian Dunning (in the videos above) that all skeptical thinkers ought to have bookmarked:

Mason I. Bilderberg (MIB)

Attribution Biases

via Unnatural Acts that can improve your thinking

Human behavior can be understood as issuing from “internal” factors or personal characteristics–such as motives, intentions, or personality traits–and from “external” factors–such as the physical or social environment and other factors deemed out of one’s personal control. Self-serving creatures that we are, we tend to attribute our own successes to our intelligence, knowledge, skill, perseverance, and other positive personal traits. Our failures are blamed on bad luck, sabotage by others, a lost lucky charm, and other such things. These attribution biases are referred to as the dispositional attribution bias and the situational attribution bias. They are applied in reverse when we try to explain the actions of others. Others succeed because they’re lucky or have connections and they fail because they’re stupid, wicked, or lazy.

We may tend to attribute the behaviors of others to their intentions because it is cognitively easier to do so. We often have no idea about the situational factors that might influence another person or cause them to do what they do. We can usually easily imagine, however, a personal motive or personality trait that could account for most human actions. We usually have little difficulty in seeing when situational factors are at play in affecting our own behavior. In fact, people tend to over-emphasize the role of the situation in their own behaviors and under-emphasize the role of their own personal motives or personality traits. Social psychologists refer to this tendency as the actor-observer bias.
One lesson here is that we should be careful when interpreting the behavior of others. What might appear to be laziness, dishonesty, or stupidity might be better explained by situational factors of which we are ignorant. Another lesson is that we might be giving ourselves more credit for our actions than we deserve. The situation may have driven us more than we admit. Maybe we “just did what anybody would do in that situation” or maybe we were just lucky. We may want to follow the classical Greek maxim “know thyself,” but modern neuroscience has awakened us to the fact that much of our thinking goes on at the unconscious level and we often don’t know what is really motivating us to do what we do or think what we think.

Something similar to the self-serving attribution of positive traits to explain our own behavior and negative traits to explain the behavior of others occurs with regard to beliefs.

MORE . . .

The 12 cognitive biases that prevent you from being rational

By George Dvorsky via io9.com

The human brain is capable of 10¹⁶ processes per second, which makes it far more powerful than any computer currently in existence. But that doesn’t mean our brains don’t have major limitations. The lowly calculator can do math thousands of times better than we can, and our memories are often less than useless — plus, we’re subject to cognitive biases, those annoying glitches in our thinking that cause us to make questionable decisions and reach erroneous conclusions. Here are a dozen of the most common and pernicious cognitive biases that you need to know about.


Before we start, it’s important to distinguish between cognitive biases and logical fallacies. A logical fallacy is an error in logical argumentation (e.g. ad hominem attacks, slippery slopes, circular arguments, appeal to force, etc.). A cognitive bias, on the other hand, is a genuine deficiency or limitation in our thinking — a flaw in judgment that arises from errors of memory, social attribution, and miscalculations (such as statistical errors or a false sense of probability).

Some social psychologists believe our cognitive biases help us process information more efficiently, especially in dangerous situations. Still, they lead us to make grave mistakes. We may be prone to such errors in judgment, but at least we can be aware of them. Here are some important ones to keep in mind.

Confirmation Bias

We love to agree with people who agree with us. It’s why we only visit websites that express our political opinions, and why we mostly hang around people who hold similar views and tastes. We tend to be put off by individuals, groups, and news sources that make us feel uncomfortable or insecure about our views — a discomfort the psychologist Leon Festinger called cognitive dissonance. It’s this preferential mode of behavior that leads to the confirmation bias — the often unconscious act of referencing only those perspectives that fuel our pre-existing views, while at the same time ignoring or dismissing opinions — no matter how valid — that threaten our world view. And paradoxically, the internet has only made this tendency even worse.

Ingroup Bias

Somewhat similar to the confirmation bias is the ingroup bias, a manifestation of our innate tribalistic tendencies. And strangely, much of this effect may have to do with oxytocin — the so-called “love molecule.” This neurotransmitter, while helping us to forge tighter bonds with people in our ingroup, performs the exact opposite function for those on the outside — it makes us suspicious, fearful, and even disdainful of others. Ultimately, the ingroup bias causes us to overestimate the abilities and value of our immediate group at the expense of people we don’t really know.

Gambler’s Fallacy

It’s called a fallacy, but it’s more a glitch in our thinking. We tend to put a tremendous amount of weight on previous events, believing that they’ll somehow influence future outcomes. The classic example is coin-tossing. After flipping heads, say, five consecutive times, our inclination is to predict an increase in likelihood that the next coin toss will be tails — that the odds must certainly now favor tails. But in reality, the odds are still 50/50. As statisticians say, the outcomes of different tosses are statistically independent and the probability of any outcome is still 50%.
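The coin-toss claim is easy to check empirically. This small simulation (my own sketch) counts what lands immediately after every run of five heads:

```python
import random

random.seed(1)

flips = [random.choice("HT") for _ in range(1_000_000)]

# Collect the outcome immediately following every run of five heads.
after_streak = [flips[i] for i in range(5, len(flips))
                if flips[i - 5:i] == list("HHHHH")]

prop_heads = after_streak.count("H") / len(after_streak)
print(round(prop_heads, 2))  # stays close to 0.50 – the streak is irrelevant
```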

Relatedly, there’s also the positive expectation bias — which often fuels gambling addictions. It’s the sense that our luck has to eventually change and that good fortune is on the way. It also contributes to the “hot hand” misconception. Similarly, it’s the same feeling we get when we start a new relationship that leads us to believe it will be better than the last one.

Post-Purchase Rationalization

Remember that time you bought something totally unnecessary, faulty, or overly expensive, and then you rationalized the purchase to such an extent that you convinced yourself it was a great idea all along? Yeah, that’s post-purchase rationalization in action — a kind of built-in mechanism that makes us feel better after we make crappy decisions, especially at the cash register. Also known as Buyer’s Stockholm Syndrome, it’s a way of subconsciously justifying our purchases — especially expensive ones. Social psychologists say it stems from the principle of commitment, our psychological desire to stay consistent and avoid a state of cognitive dissonance.

Neglecting Probability

Very few of us have a problem getting into a car and going for a drive, but many of us experience great trepidation about stepping inside an airplane and flying at 35,000 feet. Flying, quite obviously, is a wholly unnatural and seemingly hazardous activity. Yet virtually all of us know and acknowledge the fact that the probability of dying in an auto accident is significantly greater than that of getting killed in a plane crash — but our brains refuse to accept this crystal-clear logic (statistically, we have a 1 in 84 chance of dying in a vehicular accident, as compared to a 1 in 5,000 chance of dying in a plane crash [other sources indicate odds as high as 1 in 20,000]). It’s the same phenomenon that makes us worry about getting killed in an act of terrorism as opposed to something far more probable, like falling down the stairs or accidental poisoning.
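Taking the article’s own figures at face value (1 in 84 versus 1 in 5,000), the gap is easy to quantify:

```python
car = 1 / 84      # quoted lifetime odds of dying in a vehicle accident
plane = 1 / 5000  # quoted odds of dying in a plane crash

print(round(car / plane))  # 60: by these figures, driving is ~60 times riskier
```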

This is what the social psychologist Cass Sunstein calls probability neglect — our inability to grasp a proper sense of peril and risk — which often leads us to overstate the risks of relatively harmless activities while understating the risks of more dangerous ones.

MORE . . .

Temporal Binding

via NeuroLogica Blog

Skeptics should add another term to their lexicon of self-deception and cognitive biases – temporal binding.

Over the last half-century or so, psychologists have been quietly documenting the many ways in which people deceive themselves and distort their thinking. This knowledge, however, has insufficiently penetrated the public consciousness. When it does, it is mostly framed as, “isn’t that an interesting quirk of the human mind,” but the deeper lesson – that we cannot trust our own perception and memory – is rarely brought home.

Skeptics have taken modern neuroscience to heart. Our philosophy incorporates what I call “neuropsychological humility” – the basic recognition that our brains are subject to a host of flaws and biases, and therefore we cannot simply rely upon what we remember about what we thought we experienced. Rather, we need to rely upon a rational process and objective evidence as much as possible (part of this is relying on rigorous science to form our empirical conclusions). These flaws and biases are not confined to parlor tricks, contrived psychological experiments, and sitting in the audience of a magic show, but apply in everyday life.

Temporal binding is one tiny slice of the cognitive biases that form our everyday thinking. The overarching concept is that our memories are not passive recorders, nor are they primarily focused on the accurate recall of details. We do have a memory for details, but we also have a thematic memory, which seems to predominate. The thematic memory remembers the meaning of events, and then details are altered to fit this meaning. We construct a narrative and then over time our memory increasingly fits that narrative. This is not a conscious or deliberate process – our memories just morph over time. We are not aware of this process, nor can we distinguish an accurate memory from one that has morphed completely out of alignment with reality. They are both just memories.

Temporal binding is one manifestation of this general phenomenon, and is related to the logical fallacy, post hoc ergo propter hoc – after this therefore because of this. We tend to assume that if A precedes B then it is likely that A caused B. The logical fallacy is in assuming that A did in fact cause B without adequate independent evidence, merely because of the temporal association.

It seems that we evolved to make this assumption. Often A precedes B because it did cause it, and apparently there is a survival advantage to assuming that A probably did cause B, rather than being skeptical of this fact.

MORE . . .

Spontaneous Human Stupidity

via NeuroLogica Blog

Spontaneous Human Combustion (SHC) is one of those classic pseudosciences that have been around for a long time – like astrology, Bigfoot, and the Bermuda Triangle. I put it in the same category as the myth that we only use about 10% of our brain capacity: it’s widely believed, but no one really cares that much. It’s just something people hear about and have no reason to doubt, so they lazily accept it. I did when I was younger (in my pre-skeptical days): you hear about it on TV and think, “Huh, isn’t that interesting.”

It’s therefore a good opportunity to teach critical thinking skills. People’s brains are clogged with myths and false information, spread by rumor and the media, and accepted for lack of the proper critical thinking filters. It’s disappointing, however, when people who should know better, or whose job it is to know better, fall for such myths.

Recently an Irish coroner concluded that a man died from SHC, and it is reported:

The West Galway coroner, Ciaran McLoughlin, said there was no other adequate explanation for the death of Michael Faherty, 76, also known as Micheal O Fatharta.


The coroner said: “This fire was thoroughly investigated and I’m left with the conclusion that this fits into the category of spontaneous human combustion, for which there is no adequate explanation.”

First, let’s play a game of name-that-logical-fallacy. The core fallacy the coroner is committing is the argument from ignorance. The investigation could not find a cause for the fire, therefore here is the specific cause – SHC. The conclusion should rather be – we don’t know what caused the fire.

The coroner said the case “fits into the category” of SHC – but how?

Keep Reading: NeuroLogica Blog » Spontaneous Human Stupidity.
