Propaganda 104: Magica Verba Est Scientia Et Ars Es

[Image: crossing out “Lies” and writing “Truth” on a blackboard]

By GENE HOWINGTON

“Words have a magical power. They can bring either the greatest happiness or deepest despair; they can transfer knowledge from teacher to student; words enable the orator to sway his audience and dictate its decisions. Words are capable of arousing the strongest emotions and prompting all men’s actions.” – Sigmund Freud

“One man’s ‘magic’ is another man’s engineering. ‘Supernatural’ is a null word.” – Robert A. Heinlein

Words are magic . . . or so it seems. Words can make people change their minds. Words can make others take actions even against their own best interests. Words can shape the world, determine the fate of nations and people, create and destroy. However, as Robert Heinlein noted, one man’s magic is another man’s engineering and in the modern world, propaganda is the most engineered form of communication possible.

Magica verba est scientia et ars es.

The magic of words is science and art.

The science is in the methodology and psychology of execution. The art is in making the message appealing. This is the essence of rhetoric. How is this so? Let us first consider the methodologies of propaganda as a form of rhetoric before we look at the psychology behind these tactics. Although the psychology applies to negative (black), positive (white) and value-neutral (grey) uses of propaganda alike, in the context of this portion of the discussion the word “propaganda” should be viewed with its maximum possible negative value load, i.e. the kind of bad propaganda designed to get you to act against your best interests or to harm others. Why? Because many of these tactics favored by modern political polemicists are rooted in logical fallacies and outright lies. Knowing “snakes” as a category isn’t as useful as knowing “pit vipers” as a sub-category when the survival of the species can be at stake, so we’ll consider the dangerous kinds of propaganda first. Why? Because if you treat all snakes like they are dangerous, then you are less likely to get bitten.

The Façade of Reason – The Role of Logical Fallacies in Propaganda

First, we need to differentiate between the terms “strategy” and “tactics”.  Strategy is defined in relevant part by Webster’s as “the science and art of employing the political, economic, psychological, and military forces of a nation or group of nations to afford the maximum support to adopted policies in peace or war”.  Tactics, by contrast, is defined in relevant part by Webster’s as “the art or skill of employing available means to accomplish an end” and “the study of the grammatical relations within a language including morphology and syntax”. By better understanding the tactics of propagandists, you not only gain a certain degree of immunity from their influence, but insight into their strategic ends.

Many of these tactics rely upon logical fallacies. Etymologically speaking, most everyone knows that fallacies are falsehoods, but for the purposes of this discussion consider again Webster’s definition in relevant part: a fallacy is “an often plausible argument using false or invalid inference”. This is one of the reasons logicians are arduously trained to spot fallacies and why fallacies are so dangerous to the consumer. Logical fallacies, but especially informal logical fallacies, can provide a façade of reason, a mask of legitimacy, to an argument that is in fact logically and/or factually flawed. They accomplish this by being subtle flaws and/or by appealing to naturally occurring predispositions in human psychology. This can range from being simply wrong or mistaken to deliberate lying, depending upon the speaker and their possible motivations for inciting you to adopt their stance on an issue or to take (or fail to take) a given action. Illogic is simply illogic unless the speaker is intentionally trying to be misleading. Not everyone who is illogical is a propagandist, but at some level every propagandist preys upon illogic and uses illogic as a tool to convince you of a reality that does not exist.

By knowing the tactics and methodology of propagandists, you can deconstruct their statements, allowing you to sort through the truths and the lies; to think about the issues as they are without the filter of their perceptions and goals steering your thinking. By deconstruction of their statements, you can find the truth. Truthful decisions, no matter how ugly the truth might be, are always better than misinformed decisions or decisions made upon prima facie lies. Although a famous Roman Emperor once said that truth is a perspective, he also said that . . .

“If someone is able to show me that what I think or do is not right, I will happily change, for I seek the truth, by which no one was ever truly harmed. It is the person who continues in his self-deception and ignorance who is harmed.”― Marcus Aurelius

The Tactics of Propaganda

Name Calling and Labeling/Mislabeling – Although both name calling and labeling tactics are common, I think they are best understood when consolidated under the term “mislabeling”. Labeling in and of itself has utility. To return to the wisdom of Marcus Aurelius, ask of each and every thing what it is in itself. To that end, an accurate label is a summation, the encapsulation of an idea. Where we run into trouble is when labels are misapplied or used solely to conjure a negative implicit or explicit relationship. When someone engages in this tactic (or is the victim of it), look first at the denotation of the word(s) being used. Are they accurate? It is not name calling to call someone who acts in a sociopathic manner a sociopath. It is merely accurate if that is consistent with the behavior the person in question displays. If the label being applied is inaccurate, then that is your first hint that it is mislabeling and the speaker’s motivation should be suspect. A good way to deal with this tactic is to turn it back upon the user, either directly or by deconstruction and clarification; make definitions – preferably objective definitions from credible sources – work for you and against them. This tactic is common on blogs and this counter-tactic is best suited for such an interactive environment; however, the tactic is practiced elsewhere in media. For example, anti-abortion articles that refer to doctors who provide that legal and necessary service as “murderers”.

Even if the denotation of the word or words is accurate, ask yourself whether there is a negative connotation to the word being used. For example, in modern American English, saying someone is a black man is an accurate term if that man is indeed ethnically black and would not raise an eyebrow under normal circumstances (context matters, but we are talking about labels only at this time). Now consider if that same speaker used the term “colored man”. If you stick to the strictest meaning of the word “colored” as defined by Webster’s (“having color”), then this may be an accurate label as applied to a black man. However, if you consider the broader meaning of the word “colored”, you’d know that using that word to describe persons of races other than white, or of mixed race, is often – in my experience always – considered offensive. It carries a negative connotation of diminution, an implication of inferiority based on skin color. Of course, this is nonsense, but it is an example of a connotation being put to bad ends. This should also lead you to question the speaker’s motives.

Loaded Language – At the beginning of this series, we looked at the value of word choice and how denotation and connotation could be manipulated to bias the “value load” of language. Technically this process is referred to as using euphemisms or dysphemisms. A euphemism is when you change the value load of a word positively by substituting a less harsh word, for example using the term “police action” instead of “war”. A dysphemism is when you change the value load of a word negatively by substituting a harsher or even profane word, for example calling LEOs “pigs” instead of “police”. The best defense against loaded language is to have a broad vocabulary and a willingness to use both a dictionary and a thesaurus when you see words that elicit an emotional response. Group antonyms (and synonyms) according to their perceived value as “virtue” words (euphemistic value) or “devil” words (dysphemistic value). Considering denotation as well as other alternative word choices can be quite illuminating in deciding whether the speaker is using purposefully loaded language. A good counter-tactic for loaded language is identification of the tactic and clarification of terms.

Both labeling and loaded language can be considered forms of transfer or the logical fallacy of guilt by association.

Generalities – Speaking in generalities is an artful form of the logical fallacies of composition, division or false equivalence, in that statements are made in a general, vague or inadequate manner to state a truth about a part based on the whole, about the whole based on a part, or about a part or a whole by creating a false connotative connection that isn’t causal. When a generality crosses into propaganda, it often crosses into the realm of the informal logical fallacy known as the faulty generalization, but those are very specific kinds of fallacies and they are listed and addressed in detail below. Another form is the use of “glittering generalities”, where the word choice is based entirely on positive emotional appeal but has no real substance. This is sometimes called the P.T. Barnum tactic or the “baffle them with bull” tactic after the famous W.C. Fields quote, “If you can’t dazzle them with brilliance, baffle them with bull.” The best way to combat generalities is by logical dissection of the argument and comparing the objective nature of the evidence to the assertions made to see whether there is a causal connection or not.

Transfer/Guilt by Association and False Equivalence – Actually two distinct tactics, they are grouped together because often one is used to create the other, i.e. a false equivalence can be used to create guilt by association or vice versa. Consider the recent revelations about the Sandusky/Penn State/Second Mile child abuse scandal and how the taint of Sandusky’s crimes has (rightfully) spread to others and other organizations or the Idaho billboards (wrongfully) comparing Obama to the Aurora shooter. This is a form of red herring fallacy, but it is so prevalent in propaganda it merits separate mention.

Cherry Picking (Selective Truth) – This tactic can combine several approaches to one net effect: biasing data. This can be done by incomplete comparisons, inconsistent comparisons, quoting out of context, appealing to authority, and causal oversimplification, just to name a few of the tactics that can be used to cherry pick. It is, in summary, selecting data that supports a position while ignoring data that refutes it. The best defense against this tactic is to always ask “is this all the relevant information?”

False Causal Analysis – This tactic is where informal logical fallacies really come into their own by using combinations of tactics to create false causes for consequents. For example, consider the “arguments” by Rep. Louie Gohmert (R-Texas) that the tragedy in Aurora was a result of “ongoing attacks on Judeo-Christian beliefs” which uses outright deception combined with an appeal to emotion with a dash of bandwagon and pandering. This is a false causal analysis using multiple tactics/fallacies.

Appeals to Emotion – This tactic is in itself an informal logical fallacy, a form of faulty generalization. There are two basic groupings here: Appeals to Pure Emotion (i.e. a direct appeal designed to elicit a particular emotion) and Appeals to Distilled Emotion (i.e. indirect appeals to a set of emotions). The lists below are not all inclusive, but they highlight the most commonly used emotions targeted by speakers.

Appeals to Pure Emotion: Fear, Anger, Humor, Sentiment/Nostalgia, Pity, Flattery, Ridicule, and Spite

Appeals to Distilled Emotion – These appeals to emotion are more complex than the pure appeals.  They will be addressed in greater depth in the section on the psychology of propaganda, but they merit mention now with their rhetorical cousin(s).

  • Testimonial – Less an appeal to pure emotion, this tactic is rooted in Social Proofing and Authority.
  • Plain Folks – Sometimes confused with the Band Wagon tactic, the Plain Folks tactic is actually a variation on the same psychological mechanisms that Band Wagon relies upon – Social Proofing, Liking and Authority.
  • The Desire for Certainty – This plays to the human mind’s propensity to deal with uncertainty by filling in factual gaps in knowledge with beliefs and suppositions it treats as fact to allow the mind to come to a certain conclusion where there may not be one.  This is closely related to the Desire for Consistency.
  • Wishful Thinking – When a decision is made according to what might be pleasing to imagine, rather than according to evidence or reason. It is a form of delusional thinking and is closely related to another psychological mechanism – denial.

The best way to address appeals to emotion is to identify them as such and then illustrate why an emotional response will not be helpful to resolving the issue in question, preferably while offering a rational and viable alternative solution. The ability to read and write with detachment as well as the ability to apply logic dispassionately aids in dealing with appeals to emotion.

Band Wagon and False Consensus – The Band Wagon tactic is a variation of the appeal to popularity (argumentum ad populum) that is best illustrated by the line, “All the really cool kids are doing it.” It is, like that illustrative line, a form of peer pressure. Where it comes into full bloom as a propaganda tactic is when it is combined with false consensus. False consensus is a tactic where posters use “sockpuppet” identities to basically agree with themselves and create the illusion that the proposition is good by consensus. The best ways to combat this tactic are to familiarize yourself with posters’ writing styles so sockpuppets are easier to spot and to frequent sites that use moderation to mitigate the effects of sockpuppetry. If you have the skills, you might even write analytical software that spots sockpuppets based on published public data (a minimal sketch of that idea follows below). It has been my experience on this particular blog and topic, though, that when people use technological solutions to spot sockpuppets, the sockpuppeteers get really pissed off at not being able to lie without the threat of easy detection of their tactics. While this approach may be overboard, I have found that simply being able to recognize writers by “their voice” works almost as well. This is a skill that you may or may not be able to acquire though. It’s analogous to having an ear for dialects. You simply may or may not have the proclivity and predisposition to do this effectively, but it never hurts to try. When combined, these two tactics are illustrative of the tactic known as astroturfing. These tactics can also be used in divide and conquer strategies.
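For those inclined to try the software route, here is a minimal, purely illustrative sketch in Python of what such a tool might look like: it compares commenters’ character n-gram “fingerprints” with cosine similarity and flags suspiciously similar pairs. The screen names, sample comments and threshold are invented for the example; a real tool would need far more text per author and far more care.

```python
# Hypothetical sketch: flag possible sockpuppets by comparing comment "voices".
from collections import Counter
from math import sqrt

def char_ngrams(text, n=3):
    """Frequency count of character n-grams: a crude stylistic fingerprint."""
    text = " ".join(text.lower().split())
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def cosine_similarity(a, b):
    """Cosine similarity between two n-gram frequency counters."""
    dot = sum(a[g] * b[g] for g in set(a) & set(b))
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def flag_lookalikes(comments_by_author, threshold=0.85):
    """Yield pairs of authors whose collected comments look stylistically similar."""
    profiles = {name: char_ngrams(" ".join(posts)) for name, posts in comments_by_author.items()}
    names = sorted(profiles)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            score = cosine_similarity(profiles[a], profiles[b])
            if score >= threshold:
                yield a, b, round(score, 3)

# Invented example data, for illustration only.
sample = {
    "PosterA": ["Clearly the only sensible view is mine, as any honest reader knows."],
    "PosterB": ["Clearly the only sensible view is his, as any honest reader knows."],
    "PosterC": ["I disagree; the evidence points the other way entirely."],
}
for pair in flag_lookalikes(sample, threshold=0.5):
    print(pair)
```

The particular math matters less than the principle: writing style is data, and screen names whose “voices” are suspiciously alike and that always agree with one another are worth a second look.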

Red Herrings – This tactic actually covers a broad range of informal logical fallacies that all amount to one move, tactically speaking: misdirection. This is addressed below in greater detail along with the other informal logical fallacies.

Simple Solutions/Repetition/The Big Lie – This group of tactics is interrelated, but they are all stand-alone tactics in their own right. First, the Fallacy of the Single Cause, or causal oversimplification, is exactly what it sounds like: making the cause of a premise or conclusion simpler than it actually is to avoid or obscure other causal factors that might steer the argument in other directions. This can take several forms including improper definitions, “pat” answers and binary thinking. This works because simple bits of information are more readily absorbed by consumers than complex concepts. To make an idea, simple or otherwise, “stick” to a consumer, repetition works. This is because of the psychology of operant conditioning or learning by imitation. Monkey see, monkey do. You see it in action every day with advertising. When you combine simple solutions and repetition you get a third tactic that is far more dangerous than either on its own: the Big Lie. The term Big Lie (Große Lüge) was coined by Adolf Hitler in Mein Kampf, but the tactic was refined to the form it takes today by his henchman and Reich Minister of Propaganda, Joseph Goebbels. Goebbels was speaking of the British when he said, “follow the principle that when one lies, one should lie big, and stick to it. They keep up their lies, even at the risk of looking ridiculous.” An observation he put into effect as “well” as anyone in human history to create the false narrative of an innocent, besieged Germany striking back at an “international Jewry” which started World War I and was consequently to blame for all of Germany’s suffering in the inter-war period. The Big Lie was the sales pitch that allowed the Nazis to get away with industrial genocide. If you tell a lie that’s big enough, and you tell it often enough, people will believe you are telling the truth, no matter how ridiculous or factually false your lie is under the light of critical scrutiny. This tactic must be countered at the sources: the causal oversimplifications and the repetition. Point out causal oversimplification and counter repetition with repetition.

Direct/Indirect Deception – There is always the possibility that someone will simply lie. The best way to deal with lies is to provide proof that they are lies.  For example, when someone tries to sell you the idea that voter fraud is a real issue, present them with articles and statistics proving that voter fraud is a non-issue. Examples of indirect deception include (but are not limited to) obfuscation, intentional vagueness or confusion of topics. The best way to deal with indirect deception is a combination of proof and clarification.

Blaming the Victim/Apologetics – This is when excuses are made for bad actions by trying to rationalize away the actions of the bad actor. It is a variation on the tactics of scapegoating and false causal analysis.

Logical Fallacies: Definitions, Examples and How They Relate to Tactics

When addressing how logical fallacies relate to propaganda tactics, it is important to distinguish between formal logical fallacies and informal logical fallacies. It is a common misconception that an informal fallacy isn’t as serious a logical flaw as a formal fallacy, but that is because people often default to the common parlance in considering the difference between the words “formal” and “informal”. When discussing logic, these two words are not synonyms for “fancy” and “casual”; instead, “formal” means the fallacy applies directly to the form of the argument, while “informal” covers fallacies that may look formally sufficient but fail for other reasons (usually related to the argument’s content). For the sake of consistency, the definitions for the following fallacies are all derived from Wikipedia – primarily because the definitions supplied conformed substantially to those provided by other sources and secondarily because it was the only source I found that covered every fallacy I wanted to address.

Formal Fallacies

Formal fallacies come in four basic flavors: fallacies in the form of the argument, propositional fallacies, formal syllogistic fallacies and quantification fallacies.  Formal fallacies are all types of non sequitur.  This list is not exhaustive, but it is representative of the most common forms of these fallacies you are likely to encounter in propaganda.

Fallacies in the Form of the Argument

  • Appeal to probability – assumes that because something is likely to happen, it is inevitable that it will happen.  Of all the formal fallacies, this is probably the most commonly used in propaganda.  The best defense is to always examine verb choice: could versus should, might versus will, etc.  Ask: is this assertion framed in the language of possibility, probability or certainty?  Example(s):

Compare:

It might rain today. (possibility)

It is likely to rain today. (probability)

It will rain today. (certainty)

________________

This fallacy also manifests in weak analogies.

Van Gogh was an artistic genius;

Van Gogh died a poor artist;

I am a poor artist;

∴ I am an artistic genius.

  • Argument from fallacy  – This fallacy assumes that if an argument or premise for some conclusion is fallacious, then the conclusion itself is false. This is deceptively simple. The error rests not in the conclusion of the argument but rather in assuming that the conclusion must be false simply because one (or more) of the premises, or the reasoning itself, is fallacious.  Example:

If P then Q;

P is a fallacious argument;

∴ Q is fallacious.

________________

Consider two people arguing.  Person A says, “All chimps are animals. Bonzo is an animal.  Therefore Bonzo must be a chimp.”  Person B points out that Person A is affirming the consequent, which is a logical fallacy, and concludes that therefore Bonzo is not a chimp. Person B is in fact arguing from fallacy because the facts/premises of Person A’s argument are inconclusive as to whether Bonzo is a chimp or not: “animals” is a very large set and there is insufficient evidence that Bonzo, as a member of that set, belongs to the sub-set “chimps”.  Person C notes Person B’s argument and says, “B’s assumption that Bonzo is not a chimp is an argument from fallacy, therefore Bonzo must be a chimp.”  Person C is also arguing from fallacy.  In other words, simply pointing out a fallacy does not automatically prove your point. While pointing out fallacies is a good exercise in logic and can be used as a legitimate or illegitimate form of ad hominem attack to undermine the credibility of a speaker, it must be paired with a valid substantive counter-argument to make your case.  The best defense against this tactic is to have valid substantive counter-arguments to present in conjunction with pointing out fallacious reasoning (a true counter claim) or be willing to admit that an argument using fallacious reasoning can still have a correct consequent (not a true counter claim but merely a criticism of form).
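To make the Bonzo example concrete, here is a small illustrative sketch (mine, not from the original column) showing that both of Person A’s premises can hold while the conclusion goes either way, which is exactly why the premises are inconclusive:

```python
# Two hypothetical "worlds" consistent with the premises "all chimps are animals"
# and "Bonzo is an animal"; the membership below is invented for illustration.
chimps_world_1 = {"Bonzo", "Cheeta"}                 # a world where Bonzo happens to be a chimp
chimps_world_2 = {"Cheeta"}                          # a world where Bonzo is some other animal
animals_world_1 = chimps_world_1 | {"Rex"}
animals_world_2 = chimps_world_2 | {"Rex", "Bonzo"}

for chimps, animals in [(chimps_world_1, animals_world_1), (chimps_world_2, animals_world_2)]:
    assert chimps <= animals       # premise: all chimps are animals
    assert "Bonzo" in animals      # premise: Bonzo is an animal
    print("Bonzo is a chimp:", "Bonzo" in chimps)
# Both premises hold in both worlds, yet the answer differs, so neither Person A's
# inference nor Person B's counter-inference settles whether Bonzo is a chimp.
```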

  • Masked man fallacy (illicit substitution of identicals) – the substitution of identical designators in a true statement can lead to a false one.  Although the individual premises may be true, the argument is fallacious because what someone knows or believes about a thing is not a property of the thing itself.  Example:

I know X;

I don’t know Y;

∴ X is not Y.

________________

I know who Prof. Turley is;

I don’t know who the Donut Bandit is;

Therefore, Prof. Turley is not the Donut Bandit.

Propositional Fallacies

A propositional fallacy is an error in logic that concerns compound propositions. For a compound proposition to be true, the truth values of its constituent propositions must satisfy the logical connectives (and, or, not, only if, if and only if) that join them. The three most common invalid patterns – affirming a disjunct, affirming the consequent and denying the antecedent – look like this (a brute-force check of all three follows below):

Affirming a disjunct:

A or B;

A;

∴ not B.

Affirming the consequent:

If A, then B;

B;

∴ A.

Denying the antecedent:

If A, then B;

not A;

∴ not B.
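As an illustration (this check is mine, not part of the original column), a few lines of Python can brute-force every truth assignment and confirm that each of the three patterns above is invalid, reading “If A, then B” as the material conditional:

```python
# Enumerate all truth assignments for A and B and look for counterexamples:
# cases where every premise is true but the conclusion is false.
from itertools import product

forms = {
    # name: ([premises], conclusion), each as a function of (A, B)
    "affirming a disjunct":     ([lambda A, B: A or B,       lambda A, B: A],     lambda A, B: not B),
    "affirming the consequent": ([lambda A, B: (not A) or B, lambda A, B: B],     lambda A, B: A),
    "denying the antecedent":   ([lambda A, B: (not A) or B, lambda A, B: not A], lambda A, B: not B),
}

for name, (premises, conclusion) in forms.items():
    counterexamples = [
        (A, B)
        for A, B in product([True, False], repeat=2)
        if all(p(A, B) for p in premises) and not conclusion(A, B)
    ]
    print(f"{name}: invalid, counterexample(s): {counterexamples}")
```

A valid form would produce an empty list of counterexamples; each of these forms produces at least one, which is all it takes to show the form proves nothing.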

Formal Syllogistic Fallacies

Syllogistic fallacies – logical fallacies that occur in syllogisms.  Syllogistic reasoning is a basic tool of logic and generally comes in the following form:

Major Premise;

Minor Premise;

Conclusion

________________

All equines are mammals;

Ponies are equines;

Therefore all ponies are mammals.

All syllogistic arguments are constrained by quantifiers and copulas.  Quantifiers are terms that modify the subject either universally (like “all” or “no”) or in particular (like “some”, “many” or “few”).  Copulas are words that connect subjects and predicates, usually a verb and usually some form of “to be”. Because of its form and operands, this kind of logic is subject to certain kinds of formal fallacies, for example:

  • Affirmative conclusion from a negative premise (illicit negative) – When a categorical syllogism has an affirmative conclusion but at least one negative premise.  Example:

No pigs are cats;

No cats can fly;

∴ Pigs can fly.

  • Fallacy of exclusive premises – When a categorical syllogism is invalid because both of its premises are negative.  Example:

No mammals are fishes;

Some fishes are not whales;

∴ Some whales are not mammals.

  • Fallacy of four terms (quaternio terminorum) – When a categorical syllogism has four (or more) terms.  In its classic form, a syllogism has three terms.  For example, in the argument . . .

All equines are mammals;

Ponies are equines;

Therefore all ponies are mammals.

. . . the three terms are “equines”, “mammals” and “ponies” and the argument is formally correct.  However, if you change it to read . . .

All equines are mammals;

Ponies are equines;

Therefore all snakes are mammals.

. . . you have a fourth and undistributed term with “snakes”. Two premises are not sufficient to connect four terms and all the premises must have a common element. In propaganda, this kind of formal error most often manifests informally in the form of equivocation. Consider the following example . . .

The pen touches the paper;

The hand touches the pen;

∴ The hand touches the paper.

In this example of equivocation, what is the fourth term? It looks at first glance like there isn’t one, doesn’t it? You see “pen”, “paper” and “hand” plainly enough, but where is the fourth term? It is in the word “touches”, which is being used with two meanings (the equivocation). Substitute the words “is touching” for “touches” and see the difference . . .

The pen is touching the paper;

The hand is touching the pen;

∴ The hand is touching the paper.

Clearly the hand touching the pen is not the pen itself and the four terms are revealed as “the hand”, “touching the pen”, “the pen”, and “touching the paper”.

  • Illicit major – This categorical syllogism is invalid because its major term is not distributed in the major premise but is distributed in the conclusion. Example:

All A are B;

No C are A;

∴ No C are B.

________________

All roses are flowers.

No daisies are roses.

Therefore no daisies are flowers.

  • Illicit minor – This categorical syllogism is invalid because its minor term is not distributed in the minor premise but is distributed in the conclusion.  Example:

All A are B;

All A are C;

∴ All C are B.

________________

All ponies are equines.

All ponies are mammals.

Therefore all mammals are equines.

  • Negative conclusion from affirmative premises – When a categorical syllogism has a negative conclusion but both of its premises are affirmative.  Example:

All cars are Camaros;

All Camaros are made by GM;

Therefore, no cars are made by GM.

  • Fallacy of the undistributed middle – When the middle term of a categorical syllogism is not distributed in either premise.  Example:

All A are B;

All C are B;

∴ All C are A.

________________

All bass are fish.

All carp are fish.

Therefore all carp are bass.

The common term of the premises, “B” or “fish”, is undistributed. This is a corollary of another rule of logic – anything distributed in the conclusion must be distributed in at least one premise. This is not to be confused with Aristotle’s Law of Thought, the Law of the Excluded Middle.
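A quick way to see why such forms fail is to model the categorical terms as sets. The short sketch below (illustrative only, with made-up membership) shows the bass/carp premises coming out true while the conclusion comes out false, which is all it takes to show the form itself is invalid:

```python
# Model "All X are Y" as the subset relation X <= Y.
bass = {"largemouth", "smallmouth"}
carp = {"common carp", "mirror carp"}
fish = bass | carp | {"trout"}

premise_1 = bass <= fish      # All bass are fish.
premise_2 = carp <= fish      # All carp are fish.
conclusion = carp <= bass     # Therefore all carp are bass?

print(premise_1, premise_2, conclusion)   # True True False
# True premises with a false conclusion: the undistributed-middle form is invalid.
```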

Quantification Fallacies

A quantification fallacy is an error in logic where the quantifiers of the premises are in contradiction to the quantifier of the conclusion. It is really a specialized form of syllogistic fallacy related to the quantity terms “all” and “no”, and in most circumstances it will be seen in what is known as the existential fallacy, where an argument draws a particular (“some”) conclusion from two universal premises. It bears particular note because it is a fallacy often used by extremists and those locked in binary thinking. The inference is not a fallacy under Aristotelian logic (term logic), which assumes the classes it speaks of are non-empty, but it is under modern Boolean logic, which makes no such assumption. In deference to rafflaw’s well known aversion to mathematics, I won’t delve too deeply into this subject other than to note that when you see an argument like this:

All inhabitants of other planets are friendly beings;

All Martians are inhabitants of another planet;

Therefore, all Martians are friendly beings. (the valid universal conclusion)

Therefore, some Martians are friendly beings. (the existential fallacy – the premises would be true even if there were no Martians at all)

Or like this . . .

All people are superior beings.

All white people are humans.

All white people are superior beings.

Really, only some white people are superior beings (and those are the ones who recognize white superiority).

. . . you are looking at an argument committing a quantification fallacy.
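For the mathematically tolerant, here is a short illustrative sketch (mine, not from the column) of why the existential fallacy fails under the modern Boolean reading, in which “all X are Y” is vacuously true when there are no Xs at all. The sets and members are invented for the example:

```python
# With an empty class of Martians, both universal premises are (vacuously) true,
# yet the particular "some Martians are friendly" conclusion is false.
martians = set()                               # no known Martians
other_planet_inhabitants = set()               # none known, either
friendly_beings = {"Mister Rogers"}            # invented member, for illustration

premise_1 = other_planet_inhabitants <= friendly_beings   # All inhabitants of other planets are friendly.
premise_2 = martians <= other_planet_inhabitants          # All Martians inhabit another planet.
some_conclusion = any(m in friendly_beings for m in martians)  # Some Martians are friendly?

print(premise_1, premise_2, some_conclusion)   # True True False
# Moving from two universal premises to a "some" conclusion smuggles in the
# assumption that the class is non-empty, which is the existential fallacy.
```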

Informal Fallacies

In the analysis of propaganda, informal fallacies are used far more often than formal fallacies. The reasons why are fairly obvious – they don’t stand out like formal fallacies do, and their very appearance of formal sufficiency helps to further obscure the untruths being peddled. There are a wide variety of informal logical fallacies, and while this list isn’t exhaustive, it is representative of the fallacies most often used in propaganda.  There are four broad groupings: informal fallacies, faulty generalizations, red herring fallacies and conditional fallacies.

Informal Fallacies

  • Argument from ignorance (argumentum ad ignorantiam) – The assumption that a claim is true (or false) because it has not been proven false (true) or cannot be proven false (true).  Example:

There is life on Europa.

(We simply don’t have enough information to prove this statement true or false.)

  • Argument from repetition (argumentum ad nauseam) – An argument that has been discussed extensively until nobody cares to discuss it further. Do I really need to give an example for this one?
  • Argument from silence (argumentum e silentio) – Where the conclusion is based on an opponent’s silence or failure to give proof, i.e. on a “lack of evidence”. This is not the same thing as an opponent failing to properly meet their burden of proof (which is not a fallacy, but a formal failure in argumentation).
  • Begging the question (petitio principii) – When the conclusion of an argument is implicitly or explicitly assumed in one of the premises.
  • Circular cause and consequence – When the consequence of a phenomenon is claimed to be its root cause.
  • Correlation does not imply causation (cum hoc ergo propter hoc) – A faulty assumption that correlation between two variables implies that one causes the other.
  • Equivocation – This is the misleading use of a term with more than one meaning (by glossing over which meaning is intended at a particular time).
  • Etymological fallacy – In which one reasons that the original or historical meaning of a word or phrase is necessarily similar to its actual present-day meaning. This fallacy can (like many) be inverted to argue that the modern usage of a word is not the same as the historical usage of a word. For example, you very often see Randians or Libertarians trying to argue that the word “welfare” had a different meaning at the time of the drafting of the Constitution than it does now in order to argue against spending on the “general welfare” as mandated by the Constitution. This is not only an etymological fallacy, it is an outright lie, as the Webster’s Dictionary contemporaneous with the drafting of the Constitution defines “welfare” as “Well-doing or well-being; enjoyment of health and the common blessings of life. Syn. – Prosperity; happiness.” (An American Dictionary of the English Language (2 volumes; New York: S. Converse, 1828), by Noah Webster, p. 815), which comports perfectly well with the modern usage of “welfare” in Webster’s.
  • Fallacy of composition – This is assuming that something true of part of a whole must also be true of the whole.
  • Fallacy of division – assuming that something true of a thing must also be true of all or some of its parts.  This is closely related to the Ecological fallacy where inferences about the nature of specific individuals are based solely upon aggregate statistics collected for the group to which those individuals belong.
  • False dilemma (false dichotomy, fallacy of bifurcation, black-or-white fallacy) – Where two alternative statements are held to be the only possible options, when in reality there are more. You see this a lot from extremists and binary thinkers. This fallacy does violate Aristotle’s Law of the Excluded Middle in that “third options” are omitted.  Example:

“Either you’re with us or against us.” – G.W. Bush

This omits a third option, “I’m with the country, but against your unconstitutional tactics and improper selection of targets.”

  • Fallacy of many questions (complex question, fallacy of presupposition, loaded question, plurium interrogationum) – This is when a speaker asks a question that presupposes something that has not been proven or accepted by all the people involved. This fallacy is often used rhetorically, so that the question limits direct replies to those that serve the questioner’s agenda.
  • Fallacy of the single cause (causal oversimplification) – When it is assumed that there is one, simple cause of an outcome when in reality it may have been caused by a number of only jointly sufficient causes.
  • False attribution – When an advocate appeals to an irrelevant, unqualified, unidentified, biased or fabricated source in support of an argument.  This is related to contextomy (the fallacy of quoting out of context), which refers to the selective excerpting of words from their original context in a way that distorts the source’s intended meaning and to the red herring fallacy of appeal to authority discussed below with the other red herring fallacies.
  • Argument to moderation (false compromise, middle ground, fallacy of the mean) – This is assuming that the compromise between two positions is always correct.
  • Historian’s fallacy – This occurs when one assumes that decision makers of the past viewed events from the same perspective, and had the same information, as those subsequently analyzing the decision. Originalist arguments about the meaning of the Constitution are often mired in the Historian’s fallacy.  This is related to, but not to be confused with, presentism, which is a mode of historical analysis in which present-day ideas, such as moral standards, are projected into the past.
  • Homunculus fallacy – This is where a “middle-man” is used for explanation; it usually leads to a regress of middle-men and to explanations that never actually explain the real nature of a function or a process. Instead, the concept is explained in terms of the concept itself, without first defining or explaining the original concept.
  • Incomplete comparison – When not enough information is provided to make a complete comparison. This is a form of lie of omission.
  • Inconsistent comparison – When different methods of comparison are used, leaving one with a false impression of the whole comparison.  The old “apples versus oranges” scenario.
  • Ignoratio elenchi (irrelevant conclusion, missing the point) – When an argument may in itself be valid, but does not address the issue in question. You see this a lot with threadjacking and attempts to change the subject.
  • Kettle logic – This is using multiple inconsistent arguments to defend a position. Sometimes called “crazy talk”, it can be particularly disjointed.
  • Mind projection fallacy – This is when a speaker considers the way he sees the world as the way the world really is, usually in the face of contrary evidence.  You see this a lot with historical revisionists like far right Fundamentalists who want to insist this is a Christian nation despite overwhelming historical evidence that the United States was purposefully created as a secular nation with a secular government.
  • Moving the goalposts (raising the bar) – This is an argument in which evidence presented in response to a specific claim is dismissed and some other (often greater) evidence is demanded.
  • Nirvana fallacy (perfect solution fallacy) – This is when solutions to problems are rejected because they are not perfect. You also see this a lot from Libertarian types who want to argue that because regulation is an imperfect solution then it isn’t a valid solution to white collar crime.
  • Shifting the Burden of Proof – “Onus probandi incumbit ei qui dicit, non ei qui negat” – the burden of proof is on the person who makes the claim, not on the person who denies (or questions the claim). It is a particular case of the “argumentum ad ignorantiam” fallacy, where the burden is shifted on the person defending against the assertion.
  • Post hoc ergo propter hoc is Latin for “after this, therefore because of this” (false cause, coincidental correlation, correlation without causation) – X happened then Y happened; therefore X caused Y.  A variation of correlation is not causation.
  • Proof by verbosity (argumentum verbosium, proof by intimidation) – Seeking the submission of others by making an argument too complex and verbose to reasonably deal with in all its intimate details. This is related to shotgun argumentation, where the arguer offers such a large number of arguments for their position that the opponent can’t possibly respond to all of them.
  • Psychologist’s fallacy – When an observer incorrectly presupposes the objectivity of his own perspective when analyzing a behavioral event. Consider the person who cannot (as opposed to “does not but can”) address certain topics without succumbing to emotion laden argument or histrionic language.  “Think of the children!” is very often not just an appeal to emotion but an example of this fallacy as well.
  • Regression fallacy – This is when one ascribes a cause where none exists. The flaw is failing to account for natural fluctuations. It is frequently a special kind of the post hoc fallacy.
  • Reification (hypostatization) – A fallacy of ambiguity found when an abstraction (abstract belief or hypothetical construct) is treated as if it were a concrete, real event or physical entity. In other words, it is the error of treating as a “real thing” something which is not a real thing, but merely an idea. For example, arguments that treat good and/or evil as if they were tangible (and usually absolute) properties of the universe instead of concepts defined within the parameters of social context.
  • Retrospective determinism – This is the argument that because some event has occurred, its occurrence must have been inevitable beforehand. This is the cousin of outcome determinism where a desired outcome is reasoned as inevitable based on incomplete, inconsistent and/or cherry picked premises.
  • Special pleading – This is where a proponent of a position attempts to cite something as an exemption to a generally accepted rule or principle without justifying the exemption. This is found at the heart of most Randian arguments where the wealthy are due special treatment like exemption from laws because they “create jobs” (which is utter nonsense according to the evidence).
  • Wrong direction – When cause and effect are reversed (the cause is said to be the effect or vice versa), or when causation is not clear and so cause is confused with effect.

Faulty Generalizations

Faulty generalizations reach a conclusion from weak premises. Unlike fallacies of relevance, in fallacies of defective induction the premises are related to the conclusions yet only weakly buttress them.

  • Accident – When an exception to a generalization is ignored.
    • No true Scotsman – When a generalization is made true only when a counterexample is ruled out on shaky grounds.
  • Cherry picking (suppressed evidence, incomplete evidence) – The act of pointing at individual cases or data that seem to confirm a particular position, while ignoring a significant portion of related cases or data that may contradict that position, this tactic was detailed above.
  • False analogy – This is an argument by analogy in which the analogy is poorly suited. A variation of the “apples versus oranges” scenario.
  • Hasty generalization (fallacy of insufficient statistics, fallacy of insufficient sample, fallacy of the lonely fact, leaping to a conclusion) – This is when a broad conclusion is drawn from a small sample. For example:

Some crazy people have been obsessed with movies.

Some crazy people act violently.

Therefore all violent crazy people are obsessed with movies.

  • Misleading vividness – This involves describing an occurrence in vivid detail, even if it is an exceptional occurrence, to convince someone that it is a problem.  Just watch the Zimmerman case as it unfolds and some of the threads on this blog surrounding it and you’ll find plenty of examples of misleading vividness (especially from the “Friends of George”).
  • Overwhelming exception – An accurate generalization that comes with qualifications which eliminate so many cases that what remains is much less impressive than the initial statement might have led one to assume.
  • Thought-terminating cliché – Often a commonly used phrase, sometimes passing as folk wisdom, used to quell cognitive dissonance, conceal lack of thought, change the subject, etc., that is used to attempt to end the debate with a cliché instead of a conclusive logical point and/or evidence.

Red Herring Fallacies

A red herring fallacy is an error in logic where a proposition is, or is intended to be, misleading in order to make irrelevant or false inferences or to change the subject. In the general case, it is any logical inference based on fake arguments, intended to compensate for the lack of real arguments or to implicitly replace the subject of the discussion. There are many informal fallacies that qualify as red herring fallacies.

  • Ad hominem – Using arguments attacking the arguer instead of the argument.  This is often a fallacy, but not always, the exception being when the speaker is making statements without evidence and relying upon their character as proof of veracity of a statement.
  • Poisoning the well – A type of ad hominem where adverse information about a target is presented with the intention of discrediting everything that the target person says.
  • Abusive fallacy – A subtype of “ad hominem” when it turns into name-calling rather than arguing about the originally proposed argument, it can also be a form of transfer or guilt by association.
  • Argumentum ad baculum (appeal to force) – When an argument is made through coercion or threats of force to support position, a.k.a. “Agree with me or I’ll kick your ass.”
  • Argumentum ad populum (appeal to belief, appeal to the majority, appeal to the people) – This is where a proposition is claimed to be true or good solely because many people believe it to be so. Just because a proposition is popular doesn’t automatically mean that it is true or good, but conversely just because a proposition is popular doesn’t mean automatically that it is false or bad.
  • Appeal to equality – When an assertion is deemed true or false based on an assumed pretense of equality. Very often seen in the arguments of those in favor of privatization of government services, in their assumption that in contracting all parties are equal because the transaction has the appearance of being voluntary.
  • Appeal to authority – When an assertion is deemed true because of the position or authority of the person asserting it.
  • Appeal to accomplishment – When an assertion is deemed true or false based on the accomplishments of the proposer, i.e. appealing to your own authority.
  • Appeal to consequences (argumentum ad consequentiam) – When the conclusion is supported by a premise that asserts positive or negative consequences from some course of action in an attempt to distract from the initial discussion.  See variations of the “Think of the children!” defense for examples of this fallacy.
  • Appeal to emotion – where an argument is made due to the manipulation of emotions, rather than the use of valid reasoning, as detailed above.
  • Appeal to novelty (argumentum ad novitatem) – where a proposal is claimed to be superior or better solely because it is new or modern.
  • Appeal to poverty (argumentum ad Lazarum) and appeal to wealth (argumentum ad crumenam) – Supporting a conclusion because the arguer is poor or refuting because the arguer is wealthy and vice versa.
  • Appeal to tradition (argumentum ad antiquitatem) – When a conclusion is supported solely because it has long been held to be true. “We’ve always done it this way.”
  • Genetic fallacy – When a conclusion is suggested based solely on something or someone’s origin rather than its current meaning or context.  Again, see Originalist Constitutional arguments for plenty of examples of this kind of fallacy.
  • Naturalistic fallacy (is–ought fallacy) – When claims about what ought to be are based solely on statements about what is.
  • Reductio ad Hitlerum (playing the Nazi card, Godwin’s law) – Comparing an opponent or their argument to Hitler or Nazism in an attempt to associate a position with one that is universally reviled. This is only a fallacy when the traits, behaviors and practices being discussed are not actually in common with those of the Nazi Party or Hitler.  For example, calling someone a Nazi because they favor a law telling you what to do is a Godwin’s law violation, but calling someone a Nazi because they favor rounding up segments of society into camps is not.
  • Straw man – When an argument is based on misrepresentation of an opponent’s position. This tactic is a favorite of propagandists and trolls everywhere.  For example, see Mitt Romney’s current misrepresentations of Obama’s statements concerning small business.  Mitt’s camp has taken words out of context to build a straw man and attack it.
  • Tu quoque (“you too”, appeal to hypocrisy) – When the argument states that a certain position is false or wrong and/or should be disregarded because its proponent fails to act consistently in accordance with that position. A smoker telling you smoking is bad and that you should quit is not false or wrong simply because they are still a smoker.

Conditional Fallacies

  • Black swan blindness – When the argument ignores low probability, high impact events, thus downplaying the role of chance and under-representing known risks. This fallacy is roughly the inverse of the appeal to probability.
  • Broken window fallacy – When an argument which disregards lost opportunity costs (typically non-obvious, difficult to determine or otherwise hidden costs) associated with destroying property of others, or other ways of externalizing costs onto others. For example, an argument that states breaking a window generates income for a window fitter, but disregards the fact that the money spent on the new window cannot now be spent on new shoes.
  • Naturalistic fallacy – When one attempts to prove a claim about ethics by appealing to a definition of the term “good” in terms of one or more claims about natural properties (sometimes also taken to mean the appeal to nature).  This is akin to the fallacy of reification and cousin to the pathetic fallacy (when inanimate objects are imbued with the qualities of living beings).
  • Slippery slope – When asserting that a relatively small first step inevitably leads to a chain of related events culminating in some significant impact, this topic has come up before on this blog.

These are some of the tactics you should be conscious of when consuming propaganda.

Note: This column was originally published at Res Ipsa Loquitur (jonathanturley.org) on July 29, 2012.  It has been re-edited for presentation here.

About Gene Howington

I write and do other stuff.
This entry was posted in Propaganda.

19 Responses to Propaganda 104: Magica Verba Est Scientia Et Ars Es

  1. Mike Spindell says:

    Here’s a column I just read that is pertinent to Gene’s post and shows the practiced usage of propaganda today. It begins with:

    “George Orwell famously examined the malleability of political language in the hands of society’s rich and powerful. “Political language,” he wrote, “is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind.” Throw in the innovations and conscious attempts to manipulate political language that spew forth from Frank Luntz’s focus groups and the producers at Fox News, and it’s pretty clear that Orwell’s insight has been proven correct with a vengeance.

    It’s an old cliché that “knowledge is power,” but one could also argue that “power is knowledge.” The powerful have the means to define reality on their terms. Ideas that serve power, such as market fundamentalism, are widely ventilated, while those that don’t get relegated to the dustbin. Just look at the climate change “debate” where the fossil fuel industry and its right-wing shills believe they can trump science through propaganda. Few practices illustrate this phenomenon better than how ruling elites these days inside and outside the corporate media talk about “reform.” For example:………”

    http://www.huffingtonpost.com/joseph-a-palermo/maybe-we-need-to-find-a-n_b_4563091.html

  2. Blind Faithiness says:

    Even better the 2nd time around. This time I remembered to bookmark for later. :thumbsup:

  3. Blouise says:

    My favorite of the whole series.

  4. “Either you’re with us or against us.” – G.W. Bush

    I think that Deputy Dubya said something even more fallacious and outrageous:

    “Either you’re with us or you’re for the terrorists.”

    The disputatious dialectical gambit (as Arthur Schopenhauer would say) worked — and continues working — marvelously on the Democrats and the thoroughly corrupt and co-opted corporate media. Some thoughts on all this from George Orwell’s essay “Through a Glass, Rosily” (Tribune, 23 November 1945):

    Whenever A and B are in opposition to one another, anyone who attacks or criticises A is accused of aiding and abetting B. And it is often true, objectively and on a short-term analysis, that he is making things easier for B. Therefore, say the supporters of A, shut up and don’t criticize: or at least criticize “constructively,” which in practice always means favorably. And from this it is only a short step to arguing that the suppression of known facts is the highest duty of a journalist.

    A truth that’s told with bad intent
    Beats all the lies you can invent — William Blake

    The trouble is that if you lie to people, their reaction is all the more violent when the truth leaks out, as it is apt to do in the end.

    The whole argument that one mustn’t speak plainly because it “plays into the hands of” this or that sinister influence is dishonest, in the sense that people only use it when it suits them. … Beneath this argument there always lies the intention to do propaganda for some single sectional interest, and to browbeat critics into silence by telling them that they are “objectively reactionary.” It is a tempting maneuver… but it is dishonest. I think one is less likely to use it if one remembers that the advantages of a lie are always short-lived. … And yet genuine progress can only happen through increasing enlightenment, which means the continuous destruction of myths.

    So again I maintain that you cannot build an ethical, moral, or legal system on the basis of lies. You can certainly build unethical, immoral, or illegal systems on the basis of lies, and mankind has built any number of these for tens of thousands of years, but in the end the truth will out and any system of thought or belief based on lies comes crashing down. The Boy Who Cried “Wolf!” — or “WMD!” — illustrates the case metaphorically and actually.

  5. “Logic — like philosophy itself — suffers from a curious oscillation. It is elevated into the supreme and legislative science only to fall into the trivial estate of the keeper of such statements as A is A and the scholastic verses of the syllogistic rules. It claims the power to state the laws of the ultimate structure of the universe, on the ground that it deals with the laws of thought which are the laws according to which Reason has formed the world. Then it limits its pretensions to laws of correct reasoning which is correct even though it leads to no matter of fact, or even to material falsity. It is regarded by the modern objective idealist as the adequate substitute for ancient ontological metaphysics; but others treat it as that branch of rhetoric which teaches proficiency in argumentation.” — John Dewey, Reconstruction in Philosophy (1920).

    As Arthur Schopenhauer pointed out in his essay “The Art of Controversy,” people have long confused logic with dialectic, where originally the former meant dispassionate thinking for its own sake, and the latter meant disputation and the search for victory over an opponent in any argument. As Schopenhauer wrote: “The true conception of Dialectic is the art of intellectual fencing used for the purpose of getting the best of it in a dispute.” Logic, in the modern, scientific sense, has nothing to do with this branch of rhetoric, as Dewey, who studied logic under Charles Sanders Peirce, has pointed out. As for modern logic, Peirce said:

    “Be it understood, then, that in logic we are to understand the form “If A, then B” to mean, “Either A is impossible or in every possible case in which it is true, B is true likewise,” or in other words it means “In each possible case, either A is false or B is true.” (Reasoning and the Logic of Things: the Cambridge Conferences Lectures of 1898, edited by Kenneth Laine Ketner, 1992)

    So it pays to keep in mind the critical distinction between dialectics, which the lawyers practice in the interests of their clients, and logic, which scientists and the designers of digital computers practice.

  6. Unfortunately for the modern world, the influence of Aristotle’s ideas about logic still muddies the intellectual landscape to a considerable degree. As Bertrand Russell wrote in A History of Western Philosophy (1945):

    “I conclude that the Aristotelian doctrines … are wholly false, with the exception of the formal theory of the syllogism, which is unimportant. Any person of the present day who wishes to learn logic will be wasting his time if he reads Aristotle or any of his disciples. … Unfortunately, they appeared at the end of the creative period of Greek thought, and therefore came to be accepted as authoritative. By the time that logical originality revived, a reign of two thousand years had made Aristotle very difficult to dethrone. Throughout modern times, practically every advance in science, in logic, or in philosophy has had to be made in the teeth of the opposition from Aristotle’s disciples.”

    So to the extent that one bases his or her notions of logic upon the teachings of Aristotle or the Medieval Catholic Church (but I repeat myself), one thereby goes astray from the modern scientific world that Peirce, Russell, and Dewey — among so many others — helped to illuminate for our intellectual benefit.

  7. From the Preface to Formal Logic: a Scientific and Social Problem, by F. C. S, Schiller (1912):

    “For over two thousand years Formal Logic has been a stock subject of academic instruction. It has been established and endowed with a multitude of official defenders chosen from the ablest and acutest intelligences the human race has produced. Its subject-matter, moreover, is so far from being recondite that it should be familiar to every rational being. It professes to study an operation everyone professes to perform habitually, viz., thinking, and to explain how we ought to think. It might be supposed, therefore, that by this time the subject of Logic was completely explored, that every embellishment of technicality had been added, and every logical question settled beyond a shadow of a doubt.

    Instead of this, what do we find? Not only that ordinary human thinking continues to pay scant respect to Logic, but that the logicians themselves continue to differ widely as to the nature, the function, the value, and even the existence of their science. Nor has Formal Logic, despite its establishment, ever quite been able to silence the voice of the critic. Of late criticisms have so multiplied in number and increased in severity, and that among the very professionals who seemed pledged to uphold the doctrines on which their dignity and livelihood depended, that it is hard to see how a study which labors under such imputations can be called scientific. To these criticisms Formal logicians have hardly attempted a reply. Strong in the consciousness that they were beati possidentes, and that their subject, though it might be nonsense, was at any rate consecrated by a tradition of 2,000 years, and that the history of education proves that nothing has a greater hold on the human mind than nonsense fortified by technicality, because the more nonsensical it is the more impervious it becomes to rational objection, the more impossible it is to amend it, and so the better it lasts [emphasis added], they have trusted that their traditional scheme of instruction would weather this storm, as it had survived the revolt of renascent literature against Medieval Scholasticism and the nineteenth century revolt of science against dogma and tradition.”

    Gene H. has provided an interesting and useful compendium of dialectical tricks and dodges upon which the practice of law and the propagation of propaganda depend. Logic, on the other hand, involves the inductive search for necessary and sufficient causal relationships, which can lead to accurate prediction; dialectics, by contrast, can only offer a posteriori (after the fact) rationalization of the unexplained but self-interested.

  8. “At present, nearly all logicians are (alas!) dialecticians.” — F. C. S. Schiller, Formal Logic: a Scientific and Social Problem (1912)

    Unfortunately, very much true as of 2014. Spin in your graves, Arthur Schopenhauer, Charles Sanders Peirce, Bertrand Russell, John Dewey, and F. C. S. Schiller. Aristotle’s “examinable nonsense” continues to mesmerize and bedevil the educational, legal, political, and journalistic establishment (note the singular, compound noun) in the United States, if not the English-speaking world. Progress does continue in scientific and technical fields where logic has freed itself from dialectics, but in too many other areas of human thinking, enlightenment — or “the destruction of myths” (as Orwell put it) — remains frighteningly remote, if even conceivable.

  9. Charlton Stanley says:

    Noted psychologist and ethicist Dr. Ken Pope has an excellent discussion of how these logical fallacies apply to psychology. This link takes you to his page on logical fallacies in psychology, ethical fallacies in psychology, ethical pitfalls, and fallacies regarding psychological assessment. Certainly a link worth bookmarking for anyone interested in fallacious thinking and rationalization.
    http://kspope.com/fallacies/index.php

  10. Gene H. says:

    MM,
    I submit that the understanding of propaganda as disseminated is all done after the fact. That defuses and mitigates the damage. Using logic to discover predicate causation can be either predictive or regressive, but its predictive value is always constrained by uncertainty, whereas regression is less so because you are moving backward in time. The point of this series isn’t causation any further than asking “how does the message benefit the speaker?” To me, causal analysis for the sake of finding prime causes is a separate and broader subject in itself.

  11. (1) “I submit that the understanding of propaganda as disseminated is all done after the fact.”

    (2) “To me, causal analysis for the sake of finding prime causes is a separate and broader subject in itself.”

    Two interesting statements, Gene, which I will reflect upon, giving them the serious attention they deserve. To begin with, let us consider a practicing propagandist at work: namely, Dr. Frank Luntz as interviewed in the PBS Frontline series “The Persuaders.”

    “You think emotions are more revelatory than the intellect for predicting [human] decisions?”

    “80 percent of our life is emotion, and only 20 percent is intellect. I am much more interested in how you feel than how you think. I can change how you think, but how you feel is something deeper and stronger, and it’s something that’s inside you. How you think is on the outside, how you feel is on the inside, so that’s what I need to understand.”

    And from another practicing propagandist — i.e., mass-marketer — Clotaire Rapaille:

    “I don’t care what you’re going to tell me intellectually. I don’t care. Give me the reptilian. Why? Because the reptilian always wins.”

    Notice that these two practicing propagandists discount a posteriori intellectual understanding and focus instead on research into the primal, subconscious emotions that cause predictable (within a specified statistical margin of error) human decisions. These professional propagandists do not simply ape, or copy, what they have seen other propagandists do, but continually seek those deeper, hidden motivations which they might exploit to maximum effect. So, if one can believe what the people say who make a [very good] living at this sort of thing, “causal analysis for the sake of finding prime causes” constitutes not a “separate and broader subject,” but the fundamental, essential one. In this, they proceed quite scientifically. They observe; they hypothesize; they experiment to test their hypotheses for predictive value; then they design and deploy their propaganda (i.e., sales) campaigns with often very substantial results — from the point of view of their paying clients. That their victims might feel intellectually rueful or philosophical after noticing that someone has robbed them does not concern these practicing propagandists as long as their victims (1) cannot identify the culprits, (2) do not look into the psychological, causal basis that made the robbery predictably successful, and (3) do not agitate for political and legal regulations that would make such rhetorical larceny more difficult and, therefore, rare in the future.

    To recapitulate, then, those who actually do propaganda focus on discovering why and how it works, and upon whom, before they disseminate it. What a few intelligent people think of propaganda after it has hoodwinked a substantial proportion of the population matters somewhat to the propagandists, but since they understand and encourage mass amnesia, they can live with the few, marginalized intellectual malcontents who notice, discuss, and complain about propaganda but never seem to do anything to make it less effective.

    I realize and appreciate your — rather modest — stated aims for this propaganda series. Nonetheless, you’ve already published this material on Jonathan Turley’s blog and have only edited it for reprinting here. For my part, I would wish for the expansion and development of this subject leading to the formulation of counter-strategies and tactics based on discoverable causal relationships in human affairs. This will entail, in my opinion, the debunking of received dogma that confuses the logic of discovery with the dialectics of disputation. You have provided a good deal of useful information about the latter. Congratulations on that. But now we need to catch up with the proponents of propaganda who scientifically seek the causes of human behavior and do not content themselves with a detached commentary about it from afar. We have noticed and commented. Now what can we do?

  12. Gene H. says:

    MM,

    You apparently don’t miss the obvious, but you do dismiss it. These articles aren’t geared toward those who “do propaganda”. They are geared toward inoculating those who are exposed to propaganda on a daily basis. They are designed to make people ask the right questions so they can sort propaganda that is trying to get something from them from any discrete information that may have other value.

    One does not teach a martial art by training the student to kill on the first day. Starting off this series by telling people how to create propaganda, rather than how to recognize and defend against it, would have been irresponsible.

    As for more material on the subject? Yeah, that will be forthcoming at a later date. So maybe you should resist the urge to “read ahead” (even if it is based on your personal experience as a creator of propaganda).

  13. Pingback: Propaganda 106 – Waging War (A Case Study) | Flowers For Socrates

  14. Charlton Stanley says:

    Gene,
    Last night my daughter and I were watching a couple of TED talks. This morning, re-reading this, it occurs to me the TED people are missing a good bet by not having you on to present this stuff.
    Um….some wheels started turning in my head. I will have to think about this. More later.

  15. Pingback: Propaganda 200: In Summation, Gist The Whitewashing Power Of Editing | Flowers For Socrates

  16. Pingback: Propaganda 102 Supplemental: Holly Would “American Sniper” | Flowers For Socrates

  17. Pingback: Propaganda 102 Supplemental: Holly Would An “American Sniper” | Flowers For Socrates

  18. Pingback: Propaganda 104 Supplemental: Vizzini and “Terrorism” (Case Study) | Flowers For Socrates

  19. Pingback: What is a Democratic Socialist? | Flowers For Socrates
