Pseudoscience is like bubble gum. It tastes pretty good, it’s fun to blow bubbles, and it annoys some people. But eventually, the flavor leaves, and you find that you’re just chewing on some nutritionally dubious substance. Now you have to find a place to spit it out.
Or I guess you can swallow it, and it stays in your intestines for the rest of your life. Oh sorry, that’s more junk science.
If you read something that makes a medical claim, here’s a quick and easy checklist to determine whether it’s pseudoscience or real science-based medicine. What we all need is an official, Skeptical Raptor endorsed, pseudoscience detector.
Here it is, your own pseudoscience detector, based on a scientific seven-point checklist for fake science.
- The discoverer pitches his claim directly to the media. Going directly to the media bypasses the all-important peer-review process, where real scientists can evaluate whether the claim is real science. There are some journalists who are thorough scientific skeptics, but they are rare. That’s why press releases rank near the bottom of acceptable scientific evidence.
- The discoverer says that a powerful establishment is trying to suppress his/her work. Special pleading for a conspiracy is just a logical fallacy. If someone discovered a cure for all cancers (probably not possible, since there are so many different cancers), the powers that be would be bringing truckloads of dollars to buy it, because they could market it for even more truckloads of money. But if you have no evidence that it cures all cancers, you’re not going to get anything.
- The scientific effect is always at the very limit of detection. This is the very definition of “it doesn’t work.” Moreover, if the effect were real, a larger dose should produce a larger effect, the typical dose-response relationship expected of all active compounds.
- Evidence for a discovery is anecdotal. Anecdotes are not data. More anecdotes are not data. Anecdotes are not controlled, and they are subject to all sorts of bias, like confirmation bias, where people notice and remember the experiences that support their beliefs and ignore the ones that don’t.