“Words have a magical power. They can bring either the greatest happiness or deepest despair; they can transfer knowledge from teacher to student; words enable the orator to sway his audience and dictate its decisions. Words are capable of arousing the strongest emotions and prompting all men’s actions.” – Sigmund Freud
“One man’s ‘magic’ is another man’s engineering. ‘Supernatural’ is a null word.” – Robert A. Heinlein
Words are magic . . . or so it seems. Words can make people change their minds. Words can make others take actions even against their own best interests. Words can shape the world, determine the fate of nations and people, create and destroy. However, as Robert Heinlein noted, one man’s magic is another man’s engineering, and in the modern world, propaganda is the most engineered form of communication possible.
Magia verborum scientia et ars est.
The magic of words is science and art.
The science is in the methodology and psychology of execution. The art is in making the message appealing. This is the essence of rhetoric. How is this so? Let us first consider the methodologies of propaganda as a form of rhetoric before we look at the psychology behind these tactics. Although the psychology applies to negative (black), positive (white) and value-neutral (grey) uses of propaganda alike, in the context of this portion of the discussion, the word “propaganda” should be viewed with its maximum possible negative value load, i.e. the kind of bad propaganda designed to get you to act against your best interests or to harm others. Why? Because many of the tactics favored by modern political polemicists are rooted in logical fallacies and outright lies. Knowing “snakes” as a category isn’t as useful as knowing “pit vipers” as a sub-category when the survival of the species can be at stake, so we’ll consider the dangerous kinds of propaganda first. Why? Because if you treat all snakes like they are dangerous, then you are less likely to get bitten.
The Façade of Reason – The Role of Logical Fallacies in Propaganda
First, we need to differentiate between the terms “strategy” and “tactics”. Strategy is defined in relevant part by Webster’s as “the science and art of employing the political, economic, psychological, and military forces of a nation or group of nations to afford the maximum support to adopted policies in peace or war”. Tactics, by contrast, is defined in relevant part by Webster’s as “the art or skill of employing available means to accomplish an end” and “the study of the grammatical relations within a language including morphology and syntax”. By better understanding the tactics of propagandists, you not only gain a certain degree of immunity from their influence, but insight into their strategic ends.
Many of these tactics rely upon logical fallacies. Etymologically speaking, most everyone knows that fallacies are falsehoods, but for the purposes of this discussion consider again Webster’s definition in relevant part: a fallacy is “an often plausible argument using false or invalid inference”. This is one of the reasons logicians are arduously trained to spot fallacies and why fallacies are so dangerous to the consumer. Logical fallacies, and especially informal logical fallacies, can provide a façade of reason, a mask of legitimacy, to an argument that is in fact logically and/or factually flawed. They accomplish this by being subtle flaws and/or appealing to naturally occurring predispositions in human psychology. This can range from being simply wrong or mistaken to deliberate lies, depending upon the speaker and their possible motivations for inciting you to adopt their stance on an issue or to take or fail to take a given action. Illogic is simply illogic unless the speaker is intentionally trying to be misleading. Not everyone who is illogical is a propagandist, but at some level, every propagandist preys upon illogic and uses illogic as a tool to convince you of a reality that does not exist.
By knowing the tactics and methodology of propagandists, you can deconstruct their statements, allowing you to sort through the truths and the lies; to think about the issues as they are without the filter of their perceptions and goals steering your thinking. By deconstructing their statements, you can find the truth. Truthful decisions, no matter how ugly the truth might be, are always better than misinformed decisions or decisions made upon prima facie lies. Although a famous Roman Emperor once said that truth is a perspective, he also said that . . .
“If someone is able to show me that what I think or do is not right, I will happily change, for I seek the truth, by which no one was ever truly harmed. It is the person who continues in his self-deception and ignorance who is harmed.”― Marcus Aurelius
The Tactics of Propaganda
Name Calling and Labeling/Mislabeling – Although both name calling and labeling tactics are common, I think they are best understood when consolidated under the term “mislabeling”. Labeling in and of itself has utility. To return to the wisdom of Marcus Aurelius, ask of each and every thing what it is in itself. To that end, an accurate label is a summation, the encapsulation of an idea. Where we run into trouble is when labels are misapplied or used solely to conjure a negative implicit or explicit relationship. When someone engages in this tactic (or is the victim of it), look first at the denotation of the word(s) being used. Are they accurate? It is not name calling when you call someone acting in a sociopathic manner a sociopath. It is merely accurate if that label is consistent with the behavior the person in question displays. If the label being applied is inaccurate, then that is your first hint that it is mislabeling and the speaker’s motivation should be suspect. A good way to deal with this tactic is to turn it back upon the user, either directly or by deconstruction and clarification; make definitions – preferably objective definitions from credible sources – work for you and against them. This tactic is common on blogs, and this counter-tactic is best suited for such an interactive environment; however, the tactic is practiced elsewhere in media as well. For example, anti-abortion articles refer to doctors who provide that legal and necessary service as “murderers”.
Even if the denotation of the word or words is accurate, ask yourself if there is a negative connotation to the word being used. For example, in modern American English, saying someone is a black man is an accurate term if that man is indeed ethnically black and would not raise an eyebrow under normal circumstances (context matters, but we are talking about labels only at this time). Now consider if that same speaker used the term “colored man”. If you stick to the strictest meaning of the word “colored” as defined by Webster’s (“having color”), then this may be an accurate label as applied to a black man. However, if you consider the broader meaning of the word “colored”, you’d know that using that word to describe persons of races other than white or of mixed race is often – in my experience, always – considered offensive. It carries a negative connotation of diminution, an implication of inferiority based on skin color. Of course, this is nonsense, but it is an example of a connotation being put to bad ends. This should also lead you to question the speaker’s motives.
Loaded Language – At the beginning of this series, we looked at the value of word choice and how denotation and connotation could be manipulated to bias the “value load” of language. Technically this process is referred to as using euphemisms or dysphemisms. A euphemism is when you change the value load of a word positively by substituting a less harsh word, for example using the term “police action” instead of “war”. A dysphemism is when you change the value load of a word negatively by substituting a harsher or even profane word, for example calling LEOs “pigs” instead of “police”. The best defense against loaded language is to have a broad vocabulary and a willingness to use both a dictionary and a thesaurus when you see words that elicit an emotional response. Group antonyms (and synonyms) according to their perceived value as “virtue” words (euphemistic value) or “devil” words (dysphemistic value). Considering denotation as well as other alternative word choices can be quite illuminating in deciding whether the speaker is using purposefully loaded language. A good counter-tactic for loaded language is identification of the tactic and clarification of terms.
Both labeling and loaded language can be considered forms of transfer or the logical fallacy of guilt by association.
Generalities – Speaking in generalities is an artful form of the logical fallacies of composition, division or false equivalence, in that statements are made in a general, vague or inadequate manner to state a truth about a part based on the whole, about the whole based on a part, or about a part or a whole by creating a false connotative connection that isn’t causal. When a generality crosses into propaganda, it often crosses into the realm of the informal logical fallacy known as the faulty generalization, but those are very specific kinds of fallacies and they are listed and addressed in detail below. Another form is the use of “glittering generalities”, where the word choice is based entirely on positive emotional appeal but has no real substance. This is sometimes called the P.T. Barnum tactic or the “baffle them with bull” tactic after the famous W.C. Fields quote, “If you can’t dazzle them with brilliance, baffle them with bull.” The best way to combat generalities is by logical dissection of the argument and comparing the objective nature of the evidence to the assertions made to see if there is a causal connection or not.
Transfer/Guilt by Association and False Equivalence – Actually two distinct tactics, they are grouped together because often one is used to create the other, i.e. a false equivalence can be used to create guilt by association or vice versa. Consider the recent revelations about the Sandusky/Penn State/Second Mile child abuse scandal and how the taint of Sandusky’s crimes has (rightfully) spread to others and other organizations or the Idaho billboards (wrongfully) comparing Obama to the Aurora shooter. This is a form of red herring fallacy, but it is so prevalent in propaganda it merits separate mention.
Cherry Picking (Selective Truth) – This tactic can combine several approaches to one net effect: biasing data. This can be done by incomplete comparisons, inconsistent comparisons, quoting out of context, appealing to authority, and causal oversimplification, just to name a few of the tactics that can be used to cherry pick. It is, in summary, selecting data that supports a position while ignoring data that refutes it. The best defense against this tactic is to always ask “is this all the relevant information?”
False Causal Analysis – This tactic is where informal logical fallacies really come into their own by using combinations of tactics to create false causes for consequents. For example, consider the “arguments” by Rep. Louie Gohmert (R-Texas) that the tragedy in Aurora was a result of “ongoing attacks on Judeo-Christian beliefs” which uses outright deception combined with an appeal to emotion with a dash of bandwagon and pandering. This is a false causal analysis using multiple tactics/fallacies.
Appeals to Emotion – This tactic is in itself an informal logical fallacy, a form of faulty generalization. There are two basic groupings here: Appeals to Pure Emotion (i.e. a direct appeal designed to elicit a particular emotion) and Appeals to Distilled Emotion (i.e. indirect appeals to a set of emotions). The lists below are not all inclusive, but they highlight the most commonly used emotions targeted by speakers.
Appeals to Pure Emotion: Fear, Anger, Humor, Sentiment/Nostalgia, Pity, Flattery, Ridicule, and Spite
Appeals to Distilled Emotion – These appeals to emotion are more complex than the pure appeals. They will be addressed in greater depth in the section on the psychology of propaganda, but they merit mention now with their rhetorical cousin(s).
- Testimonial – Less an appeal to pure emotion, this tactic is rooted in Social Proofing and Authority.
- Plain Folks – Sometimes confused with the Band Wagon tactic, the Plain Folks tactic is actually a variation on the same psychological mechanisms that Band Wagon relies upon – Social Proofing, Liking and Authority.
- The Desire for Certainty – This plays to the human mind’s propensity to deal with uncertainty by filling in factual gaps in knowledge with beliefs and suppositions it treats as fact to allow the mind to come to a certain conclusion where there may not be one. This is closely related to the Desire for Consistency.
- Wishful Thinking – When a decision is made according to what might be pleasing to imagine, rather than according to evidence or reason. It is a form of delusional thinking and is closely related to another psychological mechanism – denial.
The best way to address appeals to emotion is to identify them as such and then illustrate why an emotional response will not be helpful to resolving the issue in question, preferably while offering a rational and viable alternative solution. The ability to read and write with detachment as well as the ability to apply logic dispassionately aids in dealing with appeals to emotion.
Band Wagon and False Consensus – The Band Wagon tactic is a variation of the appeal to popularity (argumentum ad populum) that is best illustrated by the line, “All the really cool kids are doing it.” It is, like that illustrative line, a form of peer pressure. Where it comes into full bloom as a propaganda tactic is when it is combined with false consensus. False consensus is a tactic where posters use “sockpuppet” identities to agree with themselves and create the false illusion that the proposition is good by consensus. The best ways to combat this tactic are to familiarize yourself with posters’ writing styles so sockpuppets are easier to spot and to frequent sites that use moderation to mitigate the effects of sockpuppetry. If you have the skills, you might even write analytical software that spots sockpuppets based on published public data. It has been my experience on this particular blog and topic, though, that when people use technological solutions to spot sockpuppets, the sockpuppeteers get really pissed off at not being able to lie without the threat of easy detection of their tactics. While that approach may be overboard, I have found that simply being able to recognize writers by “their voice” works almost as well. This is a skill that you may or may not be able to acquire. It’s analogous to having an ear for dialects. You simply may or may not have the proclivity and predisposition to do this effectively, but it never hurts to try. When combined, these two tactics are illustrative of the tactic known as astroturfing. These tactics can also be used in divide and conquer strategies.
Red Herrings – This tactic actually covers a broad range of informal logical fallacies that all amount to one move tactically speaking: misdirection. This is addressed below in greater detail along with the other informal logical fallacies.
Simple Solutions/Repetition/The Big Lie – This group of tactics is interrelated, but they are all stand-alone tactics in their own right. First, the Fallacy of the Single Cause, or causal oversimplification, is exactly what it sounds like: making the cause of a premise or conclusion simpler than it actually is to avoid or obscure other causal factors that might steer the argument in other directions. This can take several forms, including improper definitions, “pat” answers and binary thinking. This works because simple bits of information are more readily absorbed by consumers than complex concepts. To make an idea, simple or otherwise, “stick” to a consumer, repetition works. This is because of the psychology of operant conditioning and learning by imitation. Monkey see, monkey do. You see it in action every day with advertising. When you combine simple solutions and repetition you get a third tactic that is far more dangerous than either upon its own: the Big Lie. The term Big Lie (Große Lüge) was coined by Adolf Hitler in Mein Kampf, but the tactic was refined to the form it takes today by his henchman and Reich Minister of Propaganda, Joseph Goebbels. Goebbels was speaking of the British when he said, “follow the principle that when one lies, one should lie big, and stick to it. They keep up their lies, even at the risk of looking ridiculous.” An observation he put into effect as “well” as anyone in human history to create the false narrative of an innocent, besieged Germany striking back at an “international Jewry” which started World War I and was consequently to blame for all of Germany’s suffering in the inter-war period. The Big Lie was the sales pitch that allowed the Nazis to get away with industrial genocide. If you tell a lie that’s big enough, and you tell it often enough, people will believe you are telling the truth, no matter how ridiculous or factually false your lie is under the light of critical scrutiny.
This tactic must be countered at the sources: the causal oversimplifications and the repetition. Point out causal oversimplification and counter repetition with repetition.
Direct/Indirect Deception – There is always the possibility that someone will simply lie. The best way to deal with lies is to provide proof that they are lies. For example, when someone tries to sell you the idea that voter fraud is a real issue, present them with articles and statistics proving that voter fraud is a non-issue. Examples of indirect deception include (but are not limited to) obfuscation, intentional vagueness or confusion of topics. The best way to deal with indirect deception is a combination of proof and clarification.
Blaming the Victim/Apologetics – This is when excuses are made for bad actions by trying to rationalize away the actions of the bad actor. A variation on the tactic of scapegoating and false causal analysis.
Logical Fallacies: Definitions, Examples and How They Relate to Tactics
When addressing how logical fallacies relate to propaganda tactics, it is important to distinguish between formal logical fallacies and informal logical fallacies. It is a common misconception that an informal fallacy isn’t as serious a logical flaw as a formal fallacy, but that is because people often default to the common parlance in considering the difference between the words “formal” and “informal”. When discussing logic, these two words are not synonyms for “fancy” and “casual”; instead, “formal” means the fallacy applies directly to the form of the argument, while “informal” covers fallacies that may look formally sufficient but fail for other reasons (usually related to the argument’s content). For the sake of consistency, the definitions for the following fallacies are all derived from Wikipedia – primarily because the definitions supplied conformed substantially to those provided by other sources and secondarily because it was the only source I found that covered every fallacy I wanted to address.
Formal fallacies come in four basic flavors: fallacies in the form of the argument, propositional fallacies, formal syllogistic fallacies and quantification fallacies. Formal fallacies are all types of non sequitur. This list is not inclusive, but it is representative of the most common forms of these fallacies you are likely to encounter in propaganda.
Fallacies in the Form of the Argument
- Appeal to probability – assumes that because something is likely to happen, it is inevitable that it will happen. Of all the formal fallacies, this is probably the most commonly used in propaganda. The best defense is to always examine verb choice: could versus should, might versus will, etc. Ask: is this assertion framed in the language of possibility, probability or certainty? Example(s):
It might rain today. (possibility)
It is likely to rain today. (probability)
It will rain today. (certainty)
This fallacy also manifests in weak analogies.
Van Gogh was an artistic genius;
Van Gogh died a poor artist;
I am a poor artist;
∴ I am an artistic genius.
- Argument from fallacy – This fallacy assumes that if an argument or premise for some conclusion is fallacious, then the conclusion itself is false. This is deceptively simple. The error rests not in criticizing the argument but rather in assuming that the consequent is false simply because the argument for it is fallacious. Example:
If P then Q;
The argument for Q is fallacious;
∴ Q is false.
Consider two people arguing. Person A says, “All chimps are animals. Bonzo is an animal. Therefore Bonzo must be a chimp.” Person B points out that Person A is affirming the consequent, which is a logical fallacy, and concludes that Bonzo is therefore not a chimp. Person B is in fact arguing from fallacy because the premises of Person A’s argument are inconclusive as to whether Bonzo is a chimp or not: “animals” is a very large set and there is insufficient evidence that Bonzo, as a member of that set, belongs to the sub-set “chimps”. Person C notes Person B’s argument and says, “B’s assumption that Bonzo is not a chimp is an argument from fallacy, therefore Bonzo must be a chimp.” Person C is also arguing from fallacy. In other words, simply pointing out a fallacy does not automatically prove your point. While pointing out fallacies is a good exercise in logic and can be used as a legitimate or illegitimate form of ad hominem attack to undermine the credibility of a speaker, it must be paired with a valid substantive counter-argument to make your case. The best defense for this tactic is to have valid substantive counter-arguments to present in conjunction with pointing out fallacious reasoning (a true counter-claim) or to be willing to admit that an argument using fallacious reasoning can still have a correct consequent (not a true counter-claim but merely a criticism of form).
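The Bonzo example can be demonstrated mechanically: a fallacious argument leaves its conclusion undetermined, not false. The sketch below (in Python, with illustrative names of my own choosing, not from the original) builds two “worlds” in which Person A’s premises both hold; Bonzo turns out to be a chimp in one and not in the other, so the fallacious argument settles nothing either way.

```python
# Person A's premises: all chimps are animals, and Bonzo is an animal.
def premises_hold(animals, chimps, bonzo):
    return chimps <= animals and bonzo in animals

# Two worlds consistent with those premises (names are illustrative):
world1 = {"animals": {"bonzo", "cheeta"}, "chimps": {"bonzo"}}  # Bonzo IS a chimp
world2 = {"animals": {"bonzo", "rex"},    "chimps": {"rex"}}    # Bonzo is NOT a chimp

for w in (world1, world2):
    assert premises_hold(w["animals"], w["chimps"], "bonzo")
    print("bonzo" in w["chimps"])   # True in world1, False in world2
```

Since both worlds satisfy the premises, neither Person B nor Person C can settle Bonzo’s species by pointing at the other’s fallacy.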
- Masked man fallacy (illicit substitution of identicals) – the substitution of identical designators in a true statement can lead to a false one. Although the individual premises may be true, the argument is fallacious because contexts like knowing and believing are intensional: substituting one designator for another that refers to the same thing does not preserve truth inside them. Example:
I know X;
I don’t know Y;
∴ X is not Y.
I know who Prof. Turley is;
I don’t know who the Donut Bandit is;
Therefore, Prof. Turley is not the Donut Bandit.
A propositional fallacy is an error in logic that concerns compound propositions. For a compound proposition to be true, its simple propositions must have the truth values that the logical connective (and, or, not, only if, if and only if) requires.
- Affirming a disjunct – concluding that one disjunct must be false because the other disjunct is true:
A or B;
A;
∴ not B.
- Affirming the consequent – the antecedent in an indicative conditional is claimed to be true because the consequent is true:
If A, then B;
B;
∴ A.
- Denying the antecedent – the consequent in an indicative conditional is claimed to be false because the antecedent is false:
If A, then B;
not A;
∴ not B.
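These propositional forms are easy to check mechanically: a form is valid only if no assignment of truth values makes all the premises true while the conclusion is false. A brute-force sketch in Python (the `implies` and `valid` helpers are mine, purely for illustration):

```python
from itertools import product

def implies(p, q):
    # Material conditional: "if p then q" is false only when p is true and q is false.
    return (not p) or q

def valid(premises, conclusion):
    # Valid iff no truth assignment makes every premise true and the conclusion false.
    for A, B in product([True, False], repeat=2):
        if all(p(A, B) for p in premises) and not conclusion(A, B):
            return False   # counterexample found: the form is invalid
    return True

# Modus ponens (valid): if A then B; A; therefore B.
print(valid([lambda A, B: implies(A, B), lambda A, B: A], lambda A, B: B))          # True

# Affirming the consequent: if A then B; B; therefore A.
print(valid([lambda A, B: implies(A, B), lambda A, B: B], lambda A, B: A))          # False

# Denying the antecedent: if A then B; not A; therefore not B.
print(valid([lambda A, B: implies(A, B), lambda A, B: not A], lambda A, B: not B))  # False

# Affirming a disjunct: A or B; A; therefore not B.
print(valid([lambda A, B: A or B, lambda A, B: A], lambda A, B: not B))             # False
```

Each `False` comes from a concrete counterexample row in the truth table, e.g. A false and B true defeats both affirming the consequent and denying the antecedent.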
Formal Syllogistic Fallacies

A categorical syllogism draws a conclusion from two premises that share a common middle term, for example:
All equines are mammals;
Ponies are equines;
Therefore all ponies are mammals.
All syllogistic arguments are constrained by quantifiers and copula. Quantifiers are terms that modify the subject either universally (like “all” or “no”) or in particular (like “some”, “many” or “few”). Copula are words that connect subject and predicates, usually a verb and usually some form of “to be”. Because of the form and operands of this form of logic, it is subject to certain kinds of formal fallacies.
- Affirmative conclusion from a negative premise (illicit negative) – When a categorical syllogism has a positive conclusion, but at least one negative premise. Example:
No pigs are cats;
No cats can fly;
∴ Pigs can fly.
- Fallacy of exclusive premises – When a categorical syllogism is invalid because both of its premises are negative. Example:
No mammals are fishes;
Some fishes are not whales;
∴ Some whales are not mammals.
- Fallacy of four terms (quaternio terminorum) – When a categorical syllogism has four (or more) terms. In its classic form, a syllogism has three terms. For example, in the argument . . .
All equines are mammals;
Ponies are equines;
Therefore all ponies are mammals.
. . . the three terms are “equines”, “mammals” and “ponies” and the argument is formally correct. However, if you change it to read . . .
All equines are mammals;
Ponies are equines;
Therefore all snakes are mammals.
. . . you have a fourth, unconnected term in “snakes”. Two premises are not sufficient to connect four terms, and all the premises must share a common element. In propaganda, this kind of formal error most often manifests informally in the form of equivocation. Consider the following example . . .
The pen touches the paper;
The hand touches the pen;
∴ The hand touches the paper.
In this example of equivocation, what is the fourth term? It looks at first glance like there isn’t one, doesn’t it? You see “pen”, “paper” and “hand” plainly enough, but where is the fourth term? It is in the word “touches”, which is being used with two meanings (the equivocation). Substitute the words “is touching” for “touches” and see the difference . . .
The pen is touching the paper;
The hand is touching the pen;
∴ The hand is touching the paper.
Clearly the hand touching the pen is not the same relation as the pen touching the paper, and the four terms are revealed as “the hand”, “touching the pen”, “the pen”, and “touching the paper”.
- Illicit major – This categorical syllogism is invalid because its major term is not distributed in the major premise but distributed in the conclusion. Example:
All A are B;
No C are A;
∴ No C are B.
All roses are flowers.
No daisies are roses.
Therefore no daisies are flowers.
- Illicit minor – This categorical syllogism is invalid because its minor term is not distributed in the minor premise but distributed in the conclusion. Example:
All A are B;
All A are C;
∴ All C are B.
All ponies are equines.
All ponies are mammals.
Therefore all mammals are equines.
- Negative conclusion from affirmative premises (illicit affirmative) – This is when a categorical syllogism has a negative conclusion but affirmative premises. For example:
All cars are Camaros;
All Camaros are made by GM;
Therefore, no cars are made by GM.
- Fallacy of the undistributed middle – When the middle term in a categorical syllogism is not distributed. Example:
All A are B;
All C are B;
∴ All C are A.
All bass are fish.
All carp are fish.
Therefore all carp are bass.
The common term of the premises, “B” or “fish”, is undistributed. This is a corollary of another rule of logic – anything distributed in the conclusion must be distributed in at least one premise. This is not to be confused with Aristotle’s Law of Thought, the Law of the Excluded Middle.
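All of these syllogistic fallacies can be exposed the same way: treat the terms as sets and hunt for a counterexample. A minimal sketch in Python using the bass/carp example above (the particular species are, of course, just illustrative):

```python
# Premises of the undistributed middle example, modeled as subset relations.
fish = {"bass", "carp", "trout"}
bass = {"bass"}
carp = {"carp"}

assert bass <= fish      # All bass are fish.
assert carp <= fish      # All carp are fish.

# The conclusion "all carp are bass" fails even though both premises hold,
# so the argument form cannot be valid.
print(carp <= bass)      # False
```

The same subset test catches the illicit major, illicit minor and exclusive premises examples: plug in any model where the premises hold and check whether the conclusion can still fail.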
A quantification fallacy is an error in logic where the quantifiers of the premises contradict the quantifier of the conclusion. It is really a specialized form of syllogistic fallacy related to the quantity terms “all” and “no”, and in most circumstances it will be seen in what is known as the existential fallacy, where an argument has two universal premises and a particular conclusion. It bears particular note because it is a fallacy often used by extremists and those locked in binary thinking. This kind of fallacy is not rooted in Aristotelian logic (term logic), but in Boolean logic (propositional logic). In deference to rafflaw’s well known aversion to mathematics, I won’t delve too deeply into this subject other than to note that when you see an argument like this:
All inhabitants of other planets are friendly beings.
All Martians are inhabitants of another planet.
Therefore, all Martians are friendly beings.
∴ Some Martians are friendly beings.
Or like this . . .
All people are superior beings.
All white people are humans.
∴ All white people are superior beings.
Really, only some white people are superior beings (and those are the ones who recognize white superiority).
It is an argument making a quantification fallacy.
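The Boolean point can be made concrete: a universal claim over an empty set is vacuously true, so two universal premises can never, on their own, license a “some” conclusion. A minimal Python sketch (the set names are mine, chosen to match the Martian example):

```python
# Suppose there are no Martians at all.
martians = set()
friendly_beings = set()

# "All Martians are friendly beings" is vacuously true over the empty set...
print(all(m in friendly_beings for m in martians))   # True

# ...but "some Martians are friendly beings" needs a witness, and there is none.
print(any(m in friendly_beings for m in martians))   # False
```

This is exactly the gap the existential fallacy papers over: the universal premises say nothing about whether any Martians exist.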
In the analysis of propaganda, informal fallacies are used more often than formal fallacies. The reasons why are fairly obvious – they don’t stand out like formal fallacies do, and their very appearance of formal sufficiency helps to further obscure the untruths being peddled. There are a wide variety of informal logical fallacies, and while this list isn’t exhaustive, it is representative of the fallacies most often used in propaganda. There are four broad groupings: general informal fallacies, faulty generalizations, red herring fallacies and conditional fallacies.
- Argument from ignorance (argumentum ad ignorantiam) – The assumption that a claim is true (or false) because it has not been proven false (true) or cannot be proven false (true). Example:
There is life on Europa.
(We simply don’t have enough information to prove this statement true or false.)
- Argument from repetition (argumentum ad nauseam) – An argument that has been discussed extensively until nobody cares to discuss it further. Do I really need to give an example for this one?
- Argument from silence (argumentum e silentio) – Where the conclusion is based on an opponent’s silence or failure to offer proof, i.e. on a “lack of evidence”. This is not the same thing as an opponent failing to properly meet their burden of proof (which is not a fallacy, but a formal failure in argumentation).
- Begging the question (petitio principii) – When the conclusion of an argument is implicitly or explicitly assumed in one of the premises.
- Circular cause and consequence – When the consequence of a phenomenon is claimed to be its root cause.
- Correlation does not imply causation (cum hoc ergo propter hoc) – A faulty assumption that correlation between two variables implies that one causes the other.
- Equivocation – This is the misleading use of a term with more than one meaning (by glossing over which meaning is intended at a particular time).
- Etymological fallacy – Reasoning as if the original or historical meaning of a word or phrase is necessarily the same as its actual present-day meaning. This fallacy can (like many) be inverted to argue that the modern usage of a word is not the same as the historical usage of that word. For example, you very often see Randians or Libertarians trying to argue that the word “welfare” had a different meaning at the time of the drafting of the Constitution than it does now in order to argue against spending on the “general welfare” as mandated by the Constitution. This is not only an etymological fallacy, it is an outright lie, as Webster’s 1828 dictionary defines “welfare” as “Well-doing or well-being; enjoyment of health and the common blessings of life. Syn. – Prosperity; happiness.” (An American Dictionary of the English Language (2 volumes; New York: S. Converse, 1828), by Noah Webster, p. 815), which comports perfectly well with the modern usage of “welfare” in Webster’s.
- Fallacy of composition – This is assuming that something true of part of a whole must also be true of the whole.
- Fallacy of division – assuming that something true of a thing must also be true of all or some of its parts. This is closely related to the Ecological fallacy where inferences about the nature of specific individuals are based solely upon aggregate statistics collected for the group to which those individuals belong.
- False dilemma (false dichotomy, fallacy of bifurcation, black-or-white fallacy) – Where two alternative statements are held to be the only possible options, when in reality there are more. You see this a lot from extremists and binary thinkers. This fallacy does violate Aristotle’s Law of the Excluded Middle in that “third options” are omitted. Example:
“Either you’re with us or against us.” – G.W. Bush
This omits a third option, “I’m with the country, but against your unconstitutional tactics and improper selection of targets.”
- Fallacy of many questions (complex question, fallacy of presupposition, loaded question, plurium interrogationum) – This is when a speaker asks a question that presupposes something that has not been proven or accepted by all the people involved. This fallacy is often used rhetorically, so that the question limits direct replies to those that serve the questioner’s agenda.
- Fallacy of the single cause (causal oversimplification) – When it is assumed that there is one, simple cause of an outcome when in reality it may have been caused by a number of only jointly sufficient causes.
- False attribution – When an advocate appeals to an irrelevant, unqualified, unidentified, biased or fabricated source in support of an argument. This is related to contextomy (the fallacy of quoting out of context), which refers to the selective excerpting of words from their original context in a way that distorts the source’s intended meaning, and to the red herring fallacy of appeal to authority discussed below with the other red herring fallacies.
- Argument to moderation (false compromise, middle ground, fallacy of the mean) – This is assuming that the compromise between two positions is always correct.
- Historian’s fallacy – This occurs when one assumes that decision makers of the past viewed events from the same perspective and had the same information as those subsequently analyzing the decision. Originalist arguments about the meaning of the Constitution are often mired in the Historian’s fallacy. This is related to, but not to be confused with, presentism, which is a mode of historical analysis in which present-day ideas, such as moral standards, are projected into the past.
- Homunculus fallacy – This is where a “middle-man” is used for explanation; this usually leads to a regress of middle-men, explaining without actually explaining the real nature of a function or a process. Instead, it explains the concept in terms of the concept itself, without first defining or explaining the original concept.
- Incomplete comparison – When not enough information is provided to make a complete comparison. This is a form of lie of omission.
- Inconsistent comparison – When different methods of comparison are used, leaving one with a false impression of the whole comparison. The old “apples versus oranges” scenario.
- Ignoratio elenchi (irrelevant conclusion, missing the point) – An argument that may in itself be valid but does not address the issue in question. You see this a lot with threadjacking and attempts to change the subject.
- Kettle logic – This is using multiple inconsistent arguments to defend a position. Sometimes called “crazy talk”, it can be particularly disjointed.
- Mind projection fallacy – This is when a speaker considers the way he sees the world as the way the world really is, usually in the face of contrary evidence. You see this a lot with historical revisionists like far right Fundamentalists who want to insist this is a Christian nation despite overwhelming historical evidence that the United States was purposefully created as a secular nation with a secular government.
- Moving the goalposts (raising the bar) – This is when evidence presented in response to a specific claim is dismissed and some other (often greater) evidence is demanded.
- Nirvana fallacy (perfect solution fallacy) – This is when solutions to problems are rejected because they are not perfect. You also see this a lot from Libertarian types who want to argue that because regulation is an imperfect solution, it isn’t a valid solution to white collar crime.
- Shifting the Burden of Proof – “Onus probandi incumbit ei qui dicit, non ei qui negat” – the burden of proof is on the person who makes the claim, not on the person who denies (or questions) the claim. It is a particular case of the “argumentum ad ignorantiam” fallacy, where the burden is shifted onto the person defending against the assertion.
- Post hoc ergo propter hoc is Latin for “after this, therefore because of this” (false cause, coincidental correlation, correlation without causation) – X happened then Y happened; therefore X caused Y. A variation of correlation is not causation.
- Proof by verbosity (argumentum verbosium, proof by intimidation) – Compelling the submission of others with an argument too complex and verbose to reasonably deal with in all its intimate details. This is related to shotgun argumentation, in which the arguer offers such a large number of arguments for their position that the opponent can’t possibly respond to all of them.
- Psychologist’s fallacy – When an observer incorrectly presupposes the objectivity of his own perspective when analyzing a behavioral event. Consider the person who cannot (as opposed to “does not but can”) address certain topics without succumbing to emotion-laden argument or histrionic language. “Think of the children!” is very often not just an appeal to emotion but an example of this fallacy as well.
- Regression fallacy – This is ascribing cause where none exists. The flaw is failing to account for natural fluctuations. It is frequently a special case of the post hoc fallacy.
- Reification (hypostatization) – A fallacy of ambiguity found when an abstraction (abstract belief or hypothetical construct) is treated as if it were a concrete, real event or physical entity. In other words, it is the error of treating as a “real thing” something which is not a real thing, but merely an idea. For example, arguments that treat good and/or evil as if they were tangible (and usually absolute) properties of the universe instead of concepts defined within the parameters of social context.
- Retrospective determinism – This is the argument that because some event has occurred, its occurrence must have been inevitable beforehand. This is the cousin of outcome determinism where a desired outcome is reasoned as inevitable based on incomplete, inconsistent and/or cherry picked premises.
- Special pleading – This is where a proponent of a position attempts to cite something as an exemption to a generally accepted rule or principle without justifying the exemption. This is found at the heart of most Randian arguments where the wealthy are due special treatment like exemption from laws because they “create jobs” (which is utter nonsense according to the evidence).
- Wrong direction – When cause and effect are reversed, cause is said to be the effect or vice versa, or when causation is not clear, so it is confused with effect.
Faulty generalizations reach a conclusion from weak premises. Unlike fallacies of relevance, in fallacies of defective induction, the premises are related to the conclusions yet only weakly buttress the conclusions.
- Accident – When an exception to a generalization is ignored.
- No true Scotsman – When a generalization is made true only when a counterexample is ruled out on shaky grounds.
- Cherry picking (suppressed evidence, incomplete evidence) – The act of pointing at individual cases or data that seem to confirm a particular position, while ignoring a significant portion of related cases or data that may contradict that position; this tactic was detailed above.
- False analogy – An argument by analogy in which the analogy is poorly suited. A variation of the “apples versus oranges” scenario.
- Hasty generalization (fallacy of insufficient statistics, fallacy of insufficient sample, fallacy of the lonely fact, leaping to a conclusion) – This is drawing a broad conclusion from a small sample. For example:
Some crazy people have been obsessed with movies.
Some crazy people act violently.
Therefore all violent crazy people are obsessed with movies.
- Misleading vividness – This involves describing an occurrence in vivid detail, even if it is an exceptional occurrence, to convince someone that it is a problem. Just watch the Zimmerman case as it unfolds and some of the threads on this blog surrounding it and you’ll find plenty of examples of misleading vividness (especially from the “Friends of George”).
- Overwhelming exception – An accurate generalization that comes with qualifications which eliminate so many cases that what remains is much less impressive than the initial statement might have led one to assume.
- Thought-terminating cliché – Often a commonly used phrase, sometimes passing as folk wisdom, used to quell cognitive dissonance, conceal lack of thought, change the subject, etc.; it is used to attempt to end the debate with a cliché instead of a conclusive logical point and/or evidence.
Red Herring Fallacies
A red herring fallacy is an error in logic where a proposition is, or is intended to be, misleading in order to make irrelevant or false inferences or to change the subject. In the general case, it is any logical inference based on fake arguments, intended to compensate for the lack of real arguments or to implicitly replace the subject of the discussion. There are many informal fallacies that qualify as red herring fallacies.
- Ad hominem – Using arguments attacking the arguer instead of the argument. This is often a fallacy, but not always, the exception being when the speaker is making statements without evidence and relying upon their character as proof of the veracity of a statement.
- Poisoning the well – A type of ad hominem where adverse information about a target is presented with the intention of discrediting everything that the target person says.
- Abusive fallacy – A subtype of “ad hominem” where it turns into name-calling rather than arguing about the originally proposed argument; it can also be a form of transfer or guilt by association.
- Argumentum ad baculum (appeal to force) – When an argument is made through coercion or threats of force to support a position, a.k.a. “Agree with me or I’ll kick your ass.”
- Argumentum ad populum (appeal to belief, appeal to the majority, appeal to the people) – This is where a proposition is claimed to be true or good solely because many people believe it to be so. Just because a proposition is popular doesn’t automatically mean that it is true or good, but conversely just because a proposition is popular doesn’t mean automatically that it is false or bad.
- Appeal to equality – When an assertion is deemed true or false based on an assumed pretense of equality. Very often seen in the arguments of those in favor of privatization of government services, in their assumption that in contracting, all parties are equal because the transaction has the appearance of being voluntary.
- Appeal to authority – When an assertion is deemed true because of the position or authority of the person asserting it.
- Appeal to accomplishment – When an assertion is deemed true or false based on the accomplishments of the proposer, i.e. appealing to your own authority.
- Appeal to consequences (argumentum ad consequentiam) – When the conclusion is supported by a premise that asserts positive or negative consequences from some course of action in an attempt to distract from the initial discussion. See variations of the “Think of the children!” defense for examples of this fallacy.
- Appeal to emotion – Where an argument is made through the manipulation of emotions, rather than the use of valid reasoning, as detailed above.
- Appeal to novelty (argumentum ad novitatem) – Where a proposal is claimed to be superior or better solely because it is new or modern.
- Appeal to poverty (argumentum ad Lazarum) and appeal to wealth (argumentum ad crumenam) – Supporting a conclusion because the arguer is poor, or refuting it because the arguer is wealthy, and vice versa.
- Appeal to tradition (argumentum ad antiquitatem) – When a conclusion is supported solely because it has long been held to be true. “We’ve always done it this way.”
- Genetic fallacy – When a conclusion is suggested based solely on something or someone’s origin rather than its current meaning or context. Again, see Originalist Constitutional arguments for plenty of examples of this kind of fallacy.
- Naturalistic fallacy (is–ought fallacy) – When claims about what ought to be are made solely on the basis of statements about what is.
- Reductio ad Hitlerum (playing the Nazi card, Godwin’s law) – Comparing an opponent or their argument to Hitler or Nazism in an attempt to associate a position with one that is universally reviled. This is only a fallacy when the traits, behaviors and practices being discussed are not actually in common with those of the Nazi Party or Hitler. For example, calling someone a Nazi because they favor a law telling you what to do is a Godwin’s law violation, but calling someone a Nazi because they favor rounding up segments of society into camps is not.
- Straw man – An argument based on misrepresentation of an opponent’s position; this tactic is a favorite of propagandists and trolls everywhere. For example, see Mitt Romney’s current misrepresentations about Obama’s statements concerning small business. Mitt’s camp has taken words out of context to build a straw man and attack it.
- Tu quoque (“you too”, appeal to hypocrisy) – When the argument states that a certain position is false or wrong and/or should be disregarded because its proponent fails to act consistently in accordance with that position. A smoker’s statement that smoking is bad and you should quit is not false or wrong simply because they are still a smoker.
- Black swan blindness – When the argument ignores low probability, high impact events, thus downplaying the role of chance and under-representing known risks; this fallacy is roughly the inverse of the appeal to probability.
- Broken window fallacy – An argument that disregards lost opportunity costs (typically non-obvious, difficult to determine or otherwise hidden costs) associated with destroying the property of others, or other ways of externalizing costs onto others. For example, an argument that breaking a window generates income for a window fitter, but disregards the fact that the money spent on the new window cannot now be spent on new shoes.
- Naturalistic fallacy – Attempting to prove a claim about ethics by appealing to a definition of the term “good” in terms of one or more claims about natural properties (sometimes also taken to mean the appeal to nature). This is akin to the fallacy of reification and cousin to the pathetic fallacy (when inanimate objects are imbued with the qualities of living beings).
- Slippery slope – Asserting that a relatively small first step inevitably leads to a chain of related events culminating in some significant impact; this topic has come up before on this blog.
These are some of the tactics of propaganda you should be conscious of when consuming propaganda.
Note: This column was originally published at Res Ipsa Loquitur (jonathanturley.org) on July 29, 2012. It has been re-edited for presentation here.