Thursday, October 2, 2008

Wishful Thinking

A reasoner who suggests that a claim is true, or false, merely because he or she strongly hopes it is, is committing the fallacy of wishful thinking. Wishing something is true is not a relevant reason for claiming that it is actually true.

Example:

There's got to be an error here in the history book. It says Thomas Jefferson had slaves. He was our best president, and a good president would never do such a thing. That would be awful.

Willed ignorance

The reasoner who displays willed ignorance says, in effect, "I've got my mind made up, so don't confuse me with the facts." This is usually a case of the Traditional Wisdom Fallacy.

Example:

Of course she's made a mistake. We've always had meat and potatoes for dinner, and our ancestors have always had meat and potatoes for dinner, and so nobody knows what they're talking about when they start saying meat and potatoes are bad for us.

Weak Analogy

See False Analogy.

Untestability

See Unfalsifiability.

Unrepresentative Sample

If the method of collecting the sample from the population is likely to produce a sample that is unrepresentative of the population, then generalizing from the sample data commits the fallacy of unrepresentative sample. It is a kind of hasty generalization. When some of the statistical evidence is expected to be relevant to the results but is hidden or overlooked, the fallacy is called suppressed evidence.

Example:

The two men in the matching green suits that I met at the Star Trek Convention in Las Vegas had a terrible fear of cats. I remember their saying they were from Delaware. I've never met anyone else from Delaware, so I suppose everyone there has a terrible fear of cats.
Most people's background information is sufficient to tell them that people at this sort of convention are unlikely to be representative, that is, typical members of society.

Large samples can be unrepresentative, too.
Example:

We've polled over 400,000 Southern Baptists and asked them whether the best religion in the world is Southern Baptist. We have over 99% agreement, which proves our point about which religion is best.
Getting a larger sample size does not overcome sampling bias.

Unfalsifiability

This error in explanation occurs when the explanation contains a claim that is not falsifiable, because there is no way to check on the claim. That is, there would be no way to show the claim to be false if it were false.

Example:

He lied because he's possessed by demons.
This could be the correct explanation of his lying, but there's no way to check on whether it's correct. You can check whether he's twitching and moaning, but this won't be evidence about whether a supernatural force is controlling his body. The claim that he's possessed can't be verified if it's true, and it can't be falsified if it's false. So, the claim is too odd to be relied upon for an explanation of his lying. Relying on the claim is an instance of fallacious reasoning.

Undistributed Middle

In syllogistic logic, failing to distribute the middle term over at least one of the other terms is the fallacy of undistributed middle. Also called the fallacy of maldistributed middle.

Example:

All collies are animals.
All dogs are animals.
Therefore, all collies are dogs.
The middle term ("animals") is in the predicate of both universal affirmative premises and therefore is undistributed. This formal fallacy has the logical form: All C are A. All D are A. Therefore, all C are D.

Two Wrongs Make a Right

When you defend your wrong action as being right because someone previously has acted wrongly, you commit the fallacy called "two wrongs make a right." This is a kind of ad hominem fallacy.

Example:

Oops, no paper this morning. Somebody in our apartment building probably stole my newspaper. So, that makes it OK for me to steal one from my neighbor's doormat while nobody else is out here in the hallway.

Tu Quoque

The fallacy of tu quoque is committed if we conclude that someone's argument not to perform some act must be faulty because the arguer himself or herself has performed it. Similarly, when we point out that the arguer doesn't practice what he preaches and therefore suppose that there must be an error in the preaching, we are reasoning fallaciously and creating a tu quoque. This is a kind of ad hominem fallacy.

Example:

You say I shouldn't become an alcoholic because it will hurt me and my family, yet you yourself are an alcoholic, so your argument can't be worth listening to.
Discovering that a speaker is a hypocrite is a reason to be suspicious of the speaker's reasoning, but it is not a sufficient reason to discount it.

Traditional Wisdom

If you say or imply that a practice must be OK today simply because it has been the apparently wise practice in the past, you commit the fallacy of traditional wisdom. Practices with a tradition behind them might or might not have a good justification, but merely pointing out that they have been followed in the past is not always good enough, and when it is offered as if it were, the fallacy is committed. Also called argumentum consensus gentium when the traditional wisdom is that of nations.

Example:

Of course we should buy IBM's computer whenever we need new computers. We have been buying IBM as far back as anyone can remember.
The "of course" is the problem. The traditional wisdom of IBM being the right buy is some reason to buy IBM next time, but it's not a good enough reason in a climate of changing products, so the "of course" indicates that the fallacy of traditional wisdom has occurred.

Tokenism

If you interpret a merely token gesture as an adequate substitute for the real thing, you've been taken in by tokenism.

Example:

How can you call our organization racist? After all, our receptionist is African American.
If you accept this line of reasoning, you have been taken in by tokenism.

Syllogistic

Syllogistic fallacies are kinds of invalid categorical syllogisms. This list contains the fallacy of undistributed middle, the fallacy of four terms, and a few others, though there are a great many such formal fallacies.

Sweeping Generalization

See Fallacy of Accident.

Suppressed Evidence

Intentionally failing to use information suspected of being relevant and significant is committing the fallacy of suppressed evidence. This fallacy usually occurs when the information counts against one's own conclusion. Perhaps the arguer is not mentioning that experts have recently objected to one of his premises. The fallacy is a kind of fallacy of Selective Attention.

Example:

Buying the Cray Mac 11 computer for our company was the right thing to do. It meets our company's needs; it runs the programs we want it to run; it will be delivered quickly; and it costs much less than what we had budgeted.
This appears to be a good argument, but you'd change your assessment of the argument if you learned the speaker has intentionally suppressed the relevant evidence that the company's Cray Mac 11 was purchased from his brother-in-law at a 30 percent higher price than it could have been purchased elsewhere, and if you learned that a recent unbiased analysis of ten comparable computers placed the Cray Mac 11 near the bottom of the list. If the relevant information is not intentionally suppressed but rather inadvertently overlooked, the fallacy of suppressed evidence also is said to occur, although the fallacy's name is misleading in this case. The fallacy is also called the Fallacy of Incomplete Evidence and Cherry-Picking the Evidence. See also Slanting.

Superstitious Thinking

Reasoning deserves to be called superstitious if it is based on reasons that are well known to be unacceptable, usually due to unreasonable fear of the unknown, trust in magic, or an obviously false idea of what can cause what. A belief produced by superstitious reasoning is called a superstition. The fallacy is an instance of the False Cause Fallacy.

Example:

I never walk under ladders; it's bad luck.
It may be a good idea not to walk under ladders, but a proper reason to believe this is that workers on ladders occasionally drop things, and that ladders might have dripping wet paint that could damage your clothes. An improper reason for not walking under ladders is that it is bad luck to do so.

Subjectivist

The subjectivist fallacy occurs when it is mistakenly supposed that a good reason to reject a claim is that truth on the matter is relative to the person or group.

Example:

Justine has just given Jake her reasons for believing that the Devil is an imaginary evil person. Jake, not wanting to accept her conclusion, responds with, "That's perhaps true for you, but it's not true for me."

Style Over Substance

Unfortunately the style with which an argument is presented is sometimes taken as adding to the substance or strength of the argument.

Example:

You've just been told by the salesperson that the new Maytag is an excellent washing machine because it has a double washing cycle. If you were to notice that the salesperson smiled at you and was well dressed, this wouldn't add to the quality of the original argument, but unfortunately it does for those who are influenced by style over substance, as most of us are.

Straw Man

You commit the straw man fallacy whenever you attribute an easily refuted position to your opponent, one that the opponent wouldn't endorse, and then proceed to attack the easily refuted position believing you have undermined the opponent's actual position. If the misrepresentation is on purpose, then the straw man fallacy is caused by lying.

Example (a debate before the city council):

Opponent: Because of the killing and suffering of Indians that followed Columbus's discovery of America, the City of Berkeley should declare that Columbus Day will no longer be observed in our city.

Speaker: This is ridiculous, fellow members of the city council. It's not true that everybody who ever came to America from another country somehow oppressed the Indians. I say we should continue to observe Columbus Day, and vote down this resolution that will make the City of Berkeley the laughing stock of the nation.

The speaker has twisted what his opponent said; the opponent never said, nor even indirectly suggested, that everybody who ever came to America from another country somehow oppressed the Indians.

Stereotyping

Using stereotypes as if they are accurate generalizations for the whole group is an error in reasoning. Stereotypes are general beliefs we use to categorize people, objects, and events; but these beliefs are overstatements that shouldn't be taken literally. For example, consider the stereotype "She’s Mexican, so she’s going to be late." This conveys a mistaken impression of all Mexicans. On the other hand, even though most Mexicans are punctual, a German is more apt to be punctual than a Mexican, and this fact is said to be the "kernel of truth" in the stereotype. The danger in our using stereotypes is that speakers or listeners will not realize that even the best stereotypes are accurate only when taken probabilistically. As a consequence, the use of stereotypes can breed racism, sexism, and other forms of bigotry.

Example:

German people aren't good at dancing our sambas. She's German. So, she's not going to be any good at dancing our sambas.
This argument is deductively valid, but it's unsound because it rests on a false, stereotypical premise. The grain of truth in the stereotype is that the average German doesn't dance sambas as well as the average South American, but to overgeneralize and presume that ALL Germans are poor samba dancers compared to South Americans is a mistake called "stereotyping."

Stacking the Deck

See Suppressed Evidence and Slanting.

Specificity

Drawing an overly specific conclusion from the evidence. A kind of jumping to conclusions.

Example:

The trigonometry calculation came out to 35,005.6833 feet, so that's how wide the cloud is up there.

Special Pleading

Special pleading is a form of inconsistency in which the reasoner doesn't apply his or her principles consistently. It is the fallacy of applying a general principle to various situations but not applying it to a special situation that interests the arguer even though the general principle properly applies to that special situation, too.

Example:

Everyone has a duty to help the police do their job, no matter who the suspect is. That is why we must support investigations into corruption in the police department. No person is above the law. Of course, if the police come knocking on my door to ask about my neighbors and the robberies in our building, I know nothing. I'm not about to rat on anybody.
In our example, the principle of helping the police is applied to investigations of police officers but not to one's neighbors.

Sorites

See Line-Drawing.

Smokescreen

This fallacy occurs when someone offers too many details in order either to obscure the point or to cover up counter-evidence. In the latter case it would be an example of the fallacy of suppressed evidence. If you produce a smokescreen by bringing up an irrelevant issue, then you produce a red herring fallacy. Sometimes called clouding the issue.

Example:

Senator, wait before you vote on Senate Bill 88. Do you realize that Delaware passed a bill on the same subject in 1932, but it was ruled unconstitutional for these twenty reasons. Let me list them here.... Also, before you vote on SB 88 you need to know that .... And so on.
There is no recipe to follow in distinguishing smokescreens from reasonable appeals to caution and care.

Smear Tactic

A smear tactic is an unfair characterization either of the opponent or the opponent's position or argument. Smearing the opponent causes an ad hominem fallacy. Smearing the opponent's argument causes a straw man fallacy.

Small Sample

This is the fallacy of using too small a sample. If the sample is too small to provide a representative sample of the population, and if we have the background information to know that there is this problem with sample size, yet we still accept the generalization upon the sample results, then we commit the fallacy. This fallacy is the fallacy of hasty generalization, but it emphasizes statistical sampling techniques.

Example:

I've eaten in restaurants twice in my life, and both times I've gotten sick. I've learned one thing from these experiences: restaurants make me sick.
How big a sample do you need to avoid the fallacy? Relying on background knowledge about a population's lack of diversity can reduce the sample size needed for the generalization. With a completely homogeneous population, a sample of one is large enough to be representative of the population; if we've seen one electron, we've seen them all. However, eating in one restaurant is not like eating in any restaurant, so far as getting sick is concerned. We cannot place a specific number on the sample size below which the fallacy is produced unless we know about the homogeneity of the population, the desired margin of error, and the confidence level.
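
As a rough illustration of how sample size interacts with margin of error, here is a minimal Python sketch using the standard approximation z·√(p(1−p)/n) for a sample proportion at 95 percent confidence; the sample sizes are hypothetical and chosen only to show the trend:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate margin of error for a sample proportion.
    p = 0.5 is the worst case; z = 1.96 corresponds to 95% confidence."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (2, 30, 400, 1000):
    print(n, round(margin_of_error(n), 3))
# 2     0.693   -> a two-visit "sample" of restaurants tells you almost nothing
# 30    0.179
# 400   0.049
# 1000  0.031
```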

Slippery Slope

Suppose someone claims that a first step (in a chain of causes and effects, or a chain of reasoning) will probably lead to a second step that in turn will probably lead to another step and so on until a final step ends in trouble. If the likelihood of the trouble occurring is exaggerated, the slippery slope fallacy is committed.

Example:

Mom: Those look like bags under your eyes. Are you getting enough sleep?

Jeff: I had a test and stayed up late studying.

Mom: You didn't take any drugs, did you?

Jeff: Just caffeine in my coffee, like I always do.

Mom: Jeff! You know what happens when people take drugs! Pretty soon the caffeine won't be strong enough. Then you will take something stronger, maybe someone's diet pill. Then, something even stronger. Eventually, you will be doing cocaine. Then you will be a crack addict! So, don't drink that coffee.

The form of a slippery slope fallacy looks like this:
A leads to B.
B leads to C.
C leads to D.
...
Z leads to HELL.
We don't want to go to HELL.
So, don't take that first step A.
Think of the sequence A, B, C, D, ..., Z as a sequence of closely stacked dominoes. The key claim in the fallacy is that pushing over the first one will start a chain reaction of falling dominoes, each one triggering the next. But the analyst asks how likely it really is that pushing the first will lead to the fall of the last. For example, if A leads to B with a probability of 80 percent, and B leads to C with a probability of 80 percent, and C leads to D with a probability of 80 percent, is it likely that A will eventually lead to D? No, not at all; there is only about a 50-50 chance. The proper analysis of a slippery slope argument depends on sensitivity to such probabilistic calculations. Regarding terminology, if the chain of reasoning A, B, C, D, ..., Z is about causes, then the fallacy is called the Domino Fallacy.
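
The arithmetic behind this point can be made explicit. Assuming each link in the chain holds with probability 0.8, as in the example above, and treating the links as independent (an assumption added here purely for illustration), the chance that the first step carries through to the last is just the product of the link probabilities:

```python
# Probability that a chain of independent 80% links holds end to end.
# The independence assumption is added here for illustration.

p_link = 0.8
for links in (1, 2, 3, 10):
    print(links, round(p_link ** links, 3))
# 1   0.8
# 2   0.64
# 3   0.512   <- roughly the "50-50" chance mentioned above for A leading to D
# 10  0.107
```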

Slanting

This error occurs when the issue is not treated fairly because of misrepresenting the evidence by, say, suppressing part of it, or misconstruing some of it, or simply lying. See the following fallacies: Lying, Misrepresentation, Questionable Premise, Quoting out of Context, Straw Man, Suppressed Evidence.

Sharpshooter's

The sharpshooter's fallacy gets its name from someone shooting a rifle at the side of the barn and then going over and drawing a target and bull's-eye concentrically around the bullet hole. The fallacy is caused by overemphasizing random results or making selective use of coincidence. See the Fallacy of Selective Attention.

Example:

Psychic Sarah makes twenty-six predictions about what will happen next year. When one, but only one, of the predictions comes true, she says, "Aha! I can see into the future."
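
A quick calculation shows why a single "hit" among many predictions is unimpressive. Suppose, purely for illustration, that each of the twenty-six predictions has only a 10 percent chance of coming true by luck; even then, at least one hit is very likely:

```python
# Chance that at least one of 26 independent predictions comes true,
# assuming (purely for illustration) a 10% per-prediction hit rate.

p_single = 0.10
n_predictions = 26
p_at_least_one = 1 - (1 - p_single) ** n_predictions
print(round(p_at_least_one, 3))   # 0.935 -> a lone "hit" is expected, not psychic
```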

Self-Fulfilling Prophecy

The fallacy occurs when the act of prophesying will itself produce the effect that is prophesied, but the reasoner doesn't recognize this and believes the prophecy is a significant insight.

Example:

A group of students are selected to be interviewed individually by the teacher. Each selected student is told that the teacher has predicted they will do significantly better in their future school work. Actually, though, the teacher has no special information about the students and has picked the group at random. If the students believe this prediction about themselves, then, given human psychology, it is likely that they will do better merely because of the teacher's making the prediction.
The prediction will fulfill itself, so to speak, and the students commit the fallacy. This fallacy can be dangerous in an atmosphere of potential war between nations when the leader of a nation predicts that their nation will go to war against their enemy. This prediction could very well precipitate an enemy attack because the enemy calculates that if war is inevitable then it is to their military advantage not to get caught by surprise.

Selective Attention

Improperly focusing attention on certain things and ignoring others.

Example:

Father: Justine, how was your school day today? Another C on the history test like last time?
Justine: Dad, I got an A- on my history test today. Isn't that great? Only one student got an A.
Father: I see you weren't the one with the A. And what about the math quiz?
Justine: I think I did OK, better than last time.
Father: If you really did well, you'd be sure. What I'm sure of is that today was a pretty bad day for you.
The pessimist who pays attention to all the bad news and ignores the good news thereby commits the fallacy of selective attention. The remedy for this fallacy is to pay attention to all the relevant evidence. The most common examples of selective attention are the fallacy of Suppressed Evidence and the fallacy of Confirmation Bias. See also the Sharpshooter's Fallacy.

Secundum Quid

See Accident and Converse Accident, two versions of the fallacy.

Scope

The scope fallacy is caused by improperly changing or misrepresenting the scope of a phrase.

Example:

Every concerned citizen who believes that someone living in the US is a terrorist should make a report to the authorities. But Shelley told me herself that she believes there are terrorists living in the US, yet she hasn't made any reports. So, she must not be a concerned citizen.
The first sentence has ambiguous scope. It was probably originally meant in this sense: Every concerned citizen who believes (of someone that this person is living in the US and is a terrorist) should make a report to the authorities. But the speaker is clearly taking the sentence in its other, less plausible sense: Every concerned citizen who believes (that there is someone or other living in the US who is a terrorist) should make a report to the authorities. Scope fallacies usually are amphibolies.

Scare Tactic

If you suppose that terrorizing your opponent is giving him a reason for believing that you are correct, you are using a scare tactic and reasoning fallaciously.

Example:

David: My father owns the department store that gives your newspaper fifteen percent of all its advertising revenue, so I'm sure you won't want to publish any story of my arrest for spray painting the college.

Newspaper editor: Yes, David, I see your point. The story really isn't newsworthy.

David has given the editor a financial reason not to publish, but he has not given a relevant reason why the story is not newsworthy. David's tactics are scaring the editor, but it's the editor who commits the scare tactic fallacy, not David. David has merely used a scare tactic. This fallacy's name emphasizes the cause of the fallacy rather than the error itself. See also the related fallacy of appeal to emotions.

Scapegoating

If you unfairly blame an unpopular person or group of people for a problem, then you are scapegoating. This is a kind of fallacy of appeal to emotions.

Example:

Augurs were official diviners of ancient Rome. During the pre-Christian period, when Christians were unpopular, an augur would make a prediction for the emperor about, say, whether a military attack would have a successful outcome. If the prediction failed to come true, the augur would not admit failure but instead would blame nearby Christians for their evil influence on his divining powers. The elimination of these Christians, the augur would claim, could restore his divining powers and help the emperor. By using this reasoning tactic, the augur was scapegoating the Christians.

Reversing Causation

Drawing an improper conclusion about causation due to a causal assumption that reverses cause and effect. A kind of false cause fallacy.

Example:

All the corporate officers of Miami Electronics and Power have big boats. If you're ever going to become an officer of MEP, you'd better get a bigger boat.
The false assumption here is that having a big boat helps cause you to be an officer in MEP, whereas the reverse is true. Being an officer causes you to have the high income that enables you to purchase a big boat.

Regression

This fallacy occurs when regression to the mean is mistaken for a sign of a causal connection. Also called the Regressive Fallacy. It is a kind of false cause fallacy.

Example:

You are investigating the average heights of groups of Americans. You sample some people from Chicago and determine their average height. You have the figure for the mean height of Americans and notice that your Chicagoans have an average height that differs from this mean. Your second sample of the same size is from people from Miami. When you find that this group's average height is closer to the American mean height [as it is very likely to be, due to ordinary statistical regression to the mean], you falsely conclude that there must be something causing Miamians rather than Chicagoans to be more like the average American.
There is most probably nothing causing Miamians to be more like the average American; rather, the sample averages are simply regressing to the mean.
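
A small simulation can make the statistical point concrete. The Python sketch below (all numbers are hypothetical) draws pairs of independent samples from the same population, keeps only the pairs whose first sample average happened to come out unusually high, and shows that the matching second sample averages sit back near the population mean even though nothing causal is going on:

```python
import random

random.seed(0)
population_mean, sd, n = 170.0, 10.0, 25   # hypothetical heights in cm

def sample_mean():
    """Average height of one random sample of n people."""
    return sum(random.gauss(population_mean, sd) for _ in range(n)) / n

extreme_firsts, followups = [], []
for _ in range(10_000):
    first, second = sample_mean(), sample_mean()
    if first > population_mean + 3:          # keep only unusually tall first samples
        extreme_firsts.append(first)
        followups.append(second)

print(round(sum(extreme_firsts) / len(extreme_firsts), 1))  # well above 170
print(round(sum(followups) / len(followups), 1))            # back near 170: regression to the mean
```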

Refutation by Caricature

See Ad Hominem.

Red Herring

A red herring is a smelly fish that would distract even a bloodhound. It is also a digression that leads the reasoner off the track of considering only relevant information.

Example:

Will the new tax in Senate Bill 47 unfairly hurt business? One of the provisions of the bill is that the tax is higher for large employers (fifty or more employees) as opposed to small employers (six to forty-nine employees). To decide on the fairness of the bill, we must first determine whether employees who work for large employers have better working conditions than employees who work for small employers.
Bringing up the issue of working conditions is the red herring.

Rationalization

We rationalize when we inauthentically offer reasons to support our claim. We are rationalizing when we give someone a reason to justify our action even though we know this reason is not really our own reason for our action, usually because the offered reason will sound better to the audience than our actual reason.
Example:

"I bought the matzo bread from Kroger's Supermarket because it is the cheapest brand and I wanted to save money," says Alex [who knows he bought the bread from Kroger's Supermarket only because his girlfriend works there].

Quoting out of Context

If you quote someone, but select the quotation so that essential context is not available and therefore the person's views are distorted, then you've quoted "out of context." Quoting out of context in an argument creates a straw man fallacy.

Example:

Smith: I've been reading about a peculiar game in this article about vegetarianism. When we play this game, we lean out from a fourth-story window and drop down strings containing "Free food" signs on the end in order to hook unsuspecting passers-by. It's really outrageous, isn't it? Yet isn't that precisely what sports fishermen do for entertainment from their fishing boats? The article says it's time we put an end to sport fishing.

Jones: Let me quote Smith for you. He says "We...hook unsuspecting passers-by." What sort of moral monster is this man Smith?

Jones's selective quotation is fallacious because it makes Smith appear to advocate this immoral activity when the context makes it clear that he doesn't.

Quibbling

We quibble when we complain about a minor point and falsely believe that this complaint somehow undermines the main point. To avoid this error, the logical reasoner will not make a mountain out of a mole hill nor take people too literally.

Example:

I've found typographical errors in your poem, so the poem is neither inspired nor perceptive.

Questionable Premise

If you have sufficient background information to know that a premise is questionable or unlikely to be acceptable, then you commit this fallacy if you accept an argument based on that premise. This broad category of fallacies of argumentation includes appeal to authority, false dilemma, inconsistency, lying, stacking the deck, straw man, suppressed evidence, and many others.

Questionable Cause

See False Cause.

Questionable Analogy

See False Analogy.

Wednesday, October 1, 2008

Prejudicial Language

See Loaded Language.

Post Hoc

Suppose we notice that an event of kind A is followed in time by an event of kind B, and then hastily leap to the conclusion that A caused B. If so, we commit the post hoc fallacy. Correlations are often good evidence of causal connection, so the fallacy occurs only when the leap to the causal conclusion is done "hastily." The Latin term for the fallacy is post hoc, ergo propter hoc ("After this, therefore because of this"). It is a kind of false cause fallacy.

Example:

I ate in that Ethiopian restaurant three days ago and now I've just gotten food poisoning. The only other time I've eaten in an Ethiopian restaurant I also got food poisoning, but that time I got sick a week later. My eating in those kinds of restaurants is causing my food poisoning.
Your background knowledge should tell you this is unlikely because the effects of food poisoning are felt soon after the food is eaten. Before believing your illness was caused by eating in an Ethiopian restaurant, you'd need to rule out other possibilities, such as your illness being caused by what you ate a few hours before the onset of the illness.

Poisoning the Well

Poisoning the well is a preemptive attack on a person in order to discredit their testimony or argument in advance of their giving it. A person who thereby becomes unreceptive to the testimony reasons fallaciously and has become a victim of the poisoner. This is a kind of ad hominem.

Example:

[Prosecuting attorney in court] When is the defense attorney planning to call that twice-convicted child molester, David Barnington, to the stand? OK, I'll rephrase that. When is the defense attorney planning to call David Barnington to the stand?

Petitio Principii

See Begging the Question.

Perfectionist

If you remark that a proposal or claim should be rejected solely because it doesn't solve the problem perfectly, in cases where perfection isn't really required, then you've committed the perfectionist fallacy.

Example:

You said hiring a house cleaner would solve our cleaning problems because we both have full-time jobs. Now, look what happened. Every week she unplugs the toaster oven and leaves it that way. I should never have listened to you about hiring a house cleaner.

Persuasive Definition

Some people try to win their arguments by getting you to accept their faulty definition. If you buy into their definition, they've practically persuaded you already. Same as the Definist Fallacy. Poisoning the well when presenting a definition would be an example of using a persuasive definition.

Example:

Let's define a Democrat as a leftist who desires to overtax the corporations and abolish freedom in the economic sphere.

Pathetic

The pathetic fallacy is a mistaken belief due to attributing peculiarly human qualities to inanimate objects (but not to animals). The fallacy is caused by anthropomorphism.

Example:

Aargh, it won't start again. This old car always breaks down on days when I have a job interview. It must be afraid that if I get a new job, then I'll be able to afford a replacement, so it doesn't want me to get to my interview on time.

Past Practice

See Traditional Wisdom.

Oversimplification

You oversimplify when you cover up relevant complexities or make a complicated problem appear to be much simpler than it really is.

Example:

President Bush wants our country to trade with Fidel Castro's Communist Cuba. I say there should be a trade embargo against Cuba. The issue in our election is Cuban trade, and if you are against it, then you should vote for me for president.
Whom to vote for should be decided by considering quite a number of issues in addition to Cuban trade. When an oversimplification results in falsely implying that a minor causal factor is the major one, then the reasoning also commits the false cause fallacy.
