Baloney Detection Kit
Warning signs that suggest deception, based on Carl Sagan's book "The Demon-Haunted World". The following are suggested as tools for testing arguments and detecting fallacious or fraudulent arguments:
Wherever possible there must be independent confirmation of the facts.
Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
Arguments from authority carry little weight (in science there are no "authorities").
Spin more than one hypothesis - don't simply run with the first idea that caught your fancy.
Try not to get overly attached to a hypothesis just because it's yours.
Quantify, wherever possible.
If there is a chain of argument, every link in the chain must work.
Occam's razor - if there are two hypotheses that explain the data equally well, choose the simpler.
Ask whether the hypothesis can, at least in principle, be falsified (shown to be false by some unambiguous test). In other words, is it testable? Can others duplicate the experiment and get the same result?
Additional issues are:
Conduct control experiments - especially "double blind" experiments where the person taking measurements is not aware of the test and control subjects (see the sketch after this list).
Check for confounding factors - separate the variables.
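To make the point about control experiments concrete, here is a minimal Python sketch (the improvement model and its numbers are invented for illustration and are not part of the original text). In this toy model the "drug" does nothing, yet the treated group still improves on average; only the comparison with a control group reveals that the drug had no effect.

    import random
    from statistics import mean

    random.seed(1)

    def improvement(got_real_drug):
        # Toy model: every subject improves a little on average (placebo effect,
        # regression to the mean); the "drug" itself adds nothing here.
        return random.gauss(1.0, 2.0)

    treated = [improvement(True) for _ in range(100)]
    control = [improvement(False) for _ in range(100)]

    print("treated mean improvement: %+.2f" % mean(treated))
    print("control mean improvement: %+.2f" % mean(control))
    # The treated group alone looks like a success; comparing it with the
    # control group shows the "drug" made no difference.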
Common fallacies of logic and rhetoric
Ad hominem - attacking the arguer and not the argument.
Argument from "authority".
Argument from adverse consequences (putting pressure on the decision maker by pointing out dire consequences of an "unfavorable" decision).
Appeal to ignorance (absence of evidence is not evidence of absence).
Special pleading (typically referring to god's will).
Begging the question (assuming an answer in the way the question is phrased).
Observational selection (counting the hits and forgetting the misses).
Statistics of small numbers (such as drawing conclusions from inadequate sample sizes; see the sketch at the end of this list).
Misunderstanding the nature of statistics (President Eisenhower expressing astonishment and alarm on discovering that fully half of all Americans have below-average intelligence!).
Inconsistency (e.g. military expenditures based on worst-case scenarios but scientific projections on environmental dangers thriftily ignored because they are not "proved").
Non sequitur - "it does not follow" - the logic falls down.
Post hoc, ergo propter hoc - "it happened after, so it was caused by" - confusion of cause and effect.
Meaningless question ("what happens when an irresistible force meets an immovable object?").
Excluded middle - considering only the two extremes in a range of possibilities (making the "other side" look worse than it really is).
Short-term v. long-term - a subset of excluded middle ("why pursue fundamental science when we have so huge a budget deficit?").
Slippery slope - a subset of excluded middle - unwarranted extrapolation of the effects (give an inch and they will take a mile).
Confusion of correlation and causation.
Caricaturing (or stereotyping) a position to make it easier to attack.
Suppressed evidence or half-truths.
Weasel words - for example, use of euphemisms for war such as "police action" to get around limitations on Presidential powers. "An important art of politicians is to find new names for institutions which under old names have become odious to the public".
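As a rough illustration of the "statistics of small numbers" fallacy, the Python sketch below (a toy simulation added for illustration, not part of the original text) flips a fair coin and shows how wildly the observed rate swings when the sample is small:

    import random

    random.seed(0)

    def observed_rate(n_flips):
        # Fraction of heads in n_flips tosses of a fair coin (true rate is 0.5).
        return sum(random.random() < 0.5 for _ in range(n_flips)) / n_flips

    for n in (5, 50, 5000):
        rates = [round(observed_rate(n), 2) for _ in range(8)]
        print("n=%-5d observed rates: %s" % (n, rates))
    # With 5 flips the observed rate ranges anywhere from 0.0 to 1.0; with
    # 5000 flips it settles near the true value of 0.5. Conclusions drawn
    # from the tiny samples would be baloney.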
(Excerpted from The Planetary Society Australian Volunteer Coordinators; prepared by Michael Paine.)