rexresearch.com

Propaganda ~ Logical Fallacies


(1)  Allen & Greene: The Propaganda Game
(2)  Yoder: Fallacy Zoo
(3)  US Army: Psychological Operations Field Manual No. 33-1
(4)  Sweeney: Twenty-Five Ways To Suppress Truth: The Rules of Disinformation
(5)  Sweeney: 8 Traits of the Disinformationalist
(6)  References



Robert Allen & Lorne Greene: The Propaganda Game

[Excerpts]

Based on the book Thinking Straighter by George H. Moulds ~ Published in 1966 by AIM (Autotelic Instructional Materials) Publishers, New Haven, CT

Contents

I. Introduction

II. Instructions [Not included here]

III. Explanations of Techniques

A. Techniques of Self-Deception
1. Prejudice
2. Academic Detachment
3. Drawing the Line
4. Not Drawing the Line
5. Conservatism, Radicalism, Moderatism
6. Rationalization
7. Wishful Thinking
8. Tabloid Thinking
9. Causal Oversimplification
10. Inconceivability
B. Techniques of Language
1. Emotional Terms
2. Metaphor & Simile
3. Emphasis
4. Quotation Out of Context
5. Abstract Terms
6. Vagueness
7. Ambiguity
8. Shift of Meaning
C. Techniques of Irrelevance
1. Appearance
2. Manner
3. Degrees & Titles
4. Numbers
5. Status
6. Repetition
7. Slogans
8. Technical Jargon
9. Sophistical Formula
D. Techniques of Exploitation
1. Appeal to Pity
2. Appeal to Flattery
3. Appeal to Ridicule
4. Appeal to Prestige
5. Appeal to Prejudice
6. Bargain Appeal
7. Folksy Appeal
8. Join the Bandwagon Appeal
9. Appeal to Practical Consequences
10. Passing from the Acceptable to the Dubious
E. Techniques of Form
1. Concurrency
2. Post Hoc
3. Selected Instances
4. Hasty Generalization
5. Faulty Analogy
6. Composition
7. Division
8. Non Sequitur
F. Techniques of Maneuver
1. Diversion
2. Disproving a Minor Point
3. Ad Hominem
4. Appeal to Ignorance
5. Leading Question
6. Complex Question
7. Inconsequent Argument
8. Attacking a Straw Man
9. Victory by Definition
10. Begging the Question
IV. The Experts Game [Not included here]

V. Summary

VI. Suggested Answers [Not included here]

VII. Appendix [Not included here]


I. Introduction ~

Propaganda is a subject of great concern in our society today, perhaps more so than in any other society in history. With the advent of television as a complement to the other communications media now available to us, the opportunities to use propaganda in disseminating information, expounding ideas, and offering opinions have increased considerably. And, unfortunately, it is far too often the case that propaganda is used to make us accept questionable points of view, to make us vote for men who may be unfit for public office, and to make us buy products which are useless and sometimes even dangerous. Therefore, propaganda, or the method of influencing people to believe certain ideas and to follow certain courses of action, is of special importance to each of us.

The word "propaganda" comes from the Latin phrase "Congregatio de Propaganda Fide", or "Congregation for the Propagation of the Faith", a committee formed early in the Roman Catholic Church, whose function it is to aid the propagation or spread of the church doctrine throughout the world. Propaganda plays a dynamic, positive role in the daily lives of many men. Actors, preachers, teachers, politicians, editors, advertisers, salesmen, reformers, authors, parents --- our friends and even ourselves --- practice the art of persuasion. And each of us, as we attempt to put our ideas across to others, to persuade them to agree with our way of thinking, is, in a sense, acting in the ancient Roman tradition of the word: we are all missionaries for our causes.

Propaganda, as we know it today, can be a nefarious as well as a noble art. For at one moment its techniques can be used to whip up racial hatred among groups of people; at another moment, its methods can be employed to move persons to acts of warmth and kindness. It is important, therefore, that we consider a person’s motive for using a propaganda technique, as well as understanding that a technique has been used.

Often, the ideas or facts that we wish to convey are linked with words about which everyone has some emotional feeling --- words such as "mother", "home", "beauty", "love", or "cruelty", "murder" or "death" --- since both hostile and loving emotions are a part of us all. But just as there is a place for emotional feeling in men, so also there is a place for more dispassionate thinking. In a democratic society, it is the role of every citizen to make decisions after evaluating many ideas. It is especially important, then, that a citizen be able to think clearly about the ideas that are daily presented to him. It is imperative that he be able to analyze and distinguish between the emotional aura surrounding an idea and the actual content of the idea. To this goal of clear thinking the game of PROPAGANDA addresses itself.

PROPAGANDA has been designed to introduce the players to some of the techniques used to distort the thinking process. However, one should not be deceived into thinking that familiarity with the subject matter in this game qualifies him as an expert thinker. PROPAGANDA should be regarded as an introduction to, rather than a completed course in, clear thinking.

A number of cautions need to be observed as one gains a better understanding of propaganda techniques. Many times defects in argument occur innocently. This is particularly true in discussions involving families, associates, and/or close friends. Although it is hoped that your awareness of the principles and practices of propaganda will be employed in your everyday approach to problem analysis, it is recommended that you "go slow" in correcting others. No one likes to be branded publicly as an illogical fool. Also, just because a labeled technique can be attached to an argument, that argument is not necessarily invalid. Finally, it is not the aim of the authors that the PROPAGANDA GAME encourage youngsters and adults to become cynical and unduly suspicious of everything that is said and written, but rather that they become aware of the emotional overtones in all arguments and suggestions, and thus gain more thoughtful control over their responses to the multitude of ideas that they encounter daily...

II. Instructions ~ [Not included here]

III. Explanation of Techniques ~

Section A: Habits of Reflective Procedure  (Techniques of Self-Deception)

1. Prejudice ~

Example: Nathanael asked (referring to Jesus): "Can anything good come out of Nazareth?", and thus indicated his prejudice against Jesus’ hometown.

Meaning: A prejudice is an unwillingness to examine fairly the evidence and reasoning in behalf of the person or thing which is the object of the prejudice. It is a prejudgment caused by indoctrination, conditioning, or some prior experience of a singularly pleasant or unpleasant character. A prejudice has strong and deep emotional support.

In discussing Prejudice here we are not talking of appeals to known prejudices. These are made from without, as by an advertising man, a salesman, or a politician. Rather, our interest is in how your own Prejudice, unaided by outside support, victimizes you.

Prejudice differs from Hasty Generalization in that although Hasty Generalization often represents a spontaneous emotional reaction, Prejudice is always a matter of much longer standing. The feeling that operates in the latter case is deep, not superficial, and is often completely hidden from the man in its grip.

2. Academic Detachment ~

Example: "I’ve heard many arguments in favor of the Republican candidate and just as many for the Democratic. Hence I don’t find any reason to prefer one over the other, so I’m going to stay home and not vote for either one".

Meaning: We refuse to commit ourselves when decision or action is demanded. In a situation requiring a stand to be taken, we see (or think we see) persuasive arguments on both sides. But certain situations (e.g., voting) require decision and action of one kind or another. Here, instead of trying to remain neutral, we must make a decision on the basis of which side seems to have the greater weight of evidence.

3. Drawing the Line ~

Example: "Either you tell the truth or you lie".

Meaning: Sharp distinctions are drawn where it is inappropriate to draw sharp distinctions.

It is permissible to draw the line between those who are for you and those who are not for you, those who tell the truth and those who do not tell the truth, and so on. But the error, an inclination exhibited in common speech, is the failure to realize that the logical class of those who do not tell the truth includes two subclasses that are quite different: (1) those who lie and (2) those who say nothing at all.

4. Not Drawing The Line ~

Example: "If we are allowed to stay out till two o’clock in the morning, why not till three --- one hour doesn’t make that much difference".

Meaning: The existence of differences is denied just because the differences are small and therefore apparently unimportant.

5. Conservatism, Radicalism, Moderatism ~

Example: (1) "This belief is an old one, but I want you to know that the old ways are the best ways".

(2) "What we need is new ideas, completely new ways of thinking; the old is not worthy of our acceptance".

(3) "Vote for me. My program is neither conservative nor radical".

Meaning: These three habits of mind may look like forms of prejudice, but they are not necessarily such. Prejudices have histories with a beginning. But the conservative, the one who prefers what is old or familiar simply because it is old or familiar, may be born such; it is part of the temperament he brings into the world. Radicalism is the habit of preferring the new or the revolutionary just because of its newness. The moderate habitually chooses middle-of-the-road or compromise ground; he avoids the two extremes. But there is no inherent virtue in moderatism or compromise as such. Actually, there are times when our position should be conservative, times when it should be radical, and still other times when we should be moderate.

6. Rationalization ~

Example: The student, having failed the test, blames his failure on the classroom’s being so hot that he couldn’t think, whereas in reality he knows that he didn’t spend enough time in study.

Meaning: You cite reasons or causes that will justify action that really has less creditable grounds.

7. Wishful Thinking ~

Example: "My son will win because he ought to win after all his long hard preparation".

Meaning: You believe a proposition to be true because you want it to be true.

When we are forced to admit that our wishes have not become reality, we may then seek comfort in rationalizing. If, in the example cited above, the son does not win and the contest is fair, the parent will feel the necessity of inventing some argument that will excuse the son’s failure.

8. Tabloid Thinking ~

Example: "In college Basil was taught all about evolution --- the apeman theory, you know".

Meaning: To think in tabloids is to oversimplify a complex theory or set of circumstances. The tabloid thinker prefers quick summaries and has the habit of "putting things in a nutshell".

Tabloids concerning people are popular because they offer a neat summary of the character of a prominent person. "Marx? You don’t know who Marx was? Why, he was that philosopher who became impatient and irritable in his old age". It is much easier to remember Marx in this simple fashion than to remember him as a man of many interesting and controversial facets of character and conviction. These human tabloids are frequently emotional, but they are not mere Emotional Terms. To be Tabloid Thinking there must be some indication that someone is trying to sum up another’s character. All stereotypes ("barbers are talkative") are tabloids because they present a certain trait or characteristic, which is really superficial or trivial, as being the essential nature of a given class.

9. Causal Oversimplification ~

Example: "If it were not for the ammunition makers, we would never have wars".

Meaning: A complex event is explained by references to only one or two probable causes whereas many are responsible.

10. Inconceivability ~

Example: "Since Ballhead State has never in its past history won the conference title, I just can’t picture them winning it this year".

Meaning: You declare a proposition to be false simply because you cannot conceive it actualized or possible of realization.

Section B: Watch Their Language --- And Yours Too (Techniques of Language)

1. Emotional Terms ~

Example: Participant in Argument: "If you ignorant fools would only shut your traps a while and let me explain".

Meaning: An emotional term is a word or phrase  which, however much factual information it conveys about an object, also expresses and/or arouses a feeling for or against that object. Translated into neutral language the emotionally-charged example given above should read: "I don’t agree and if you’ll just give me a chance to talk, I’ll show you why".

The authors believe that emotional language is appropriate in non-controversial situations. For purposes of the PROPAGANDA game, patriotic celebrations, church services, poetry and other literary forms, and occasions when a person is expressing personal feelings without attempting to persuade or convince others are all considered to be non-controversial situations.

2. Metaphor & Simile ~

Example: Metaphor --- "Napoleon was a fox". Simile --- "Napoleon was like a fox".

Meaning: A metaphor is a comparison implied but not definitely stated. In the case of simile the comparison is explicitly stated by means of such words as "like" or "as".

In controversial situations the employment of metaphor or simile is to be avoided because such figures of speech are apt to suggest likenesses not really intended or not actually present. Napoleon was not actually a fox. He may have been like one, but if so, was it with respect to shrewdness or thievery or both or neither?

3. Emphasis ~

Example: When "We should not speak ill of our friends" is quoted, the original meaning changes according to which word is emphasized: "We should not speak ill of our friends". Emphasizing "we" suggests that we should not, true, but others may.

Meaning: The technique of emphasis occurs only when another speaker or writer is quoted and one or more words are emphasized so as to imply what would not otherwise be implied, thus putting into the mouth of the source meanings he may not have wished to convey.

Oral emphasis is usually secured by means of pitch, tone, or volume of voice. Written emphasis is secured by a variety of devices, such as italicizing and underlining. "Italics mine" (or its equivalent) is the accepted way for a writer to indicate that he is giving a stress to certain words that the original author had perhaps no intention of stressing.

4. Quotation Out of Context ~

Example: Someone quotes the Bible as saying that "money is the root of all evil", but leaves out the preceding words, "the love of".

Meaning: Quotation out of context is a propaganda technique when the effect of quoting a given statement without its context is to distort the original meaning in context.

The context of a given statement is not merely the words that precede and that follow but every accompanying circumstance, whether it be time and place or gesture and facial expression.

5. Abstract Terms ~

Example: A speaker defines "neurosis" as "a psychological term for a state of mind involving the nerves", but when he is asked to identify or point to --- among a large number of people --- a case of neurosis, he is at a loss to do so, showing that he is unable to use the term to make any concrete distinctions.

Meaning: An abstract term is a word or symbol which stands for the qualities (one or more) possessed in common by a number of particular things, facts or events. The technique of abstract terms occurs when an arguer employs a word for which he may have a meaning in the form of other words, but is unable to identify the concrete facts to which the word supposedly refers.

6. Vagueness ~

Example: Someone says to me, "Sit down on that stool", and I sit down on the thing he points to. His meaning is not ambiguous; I understand what he is referring to. But I find the term "stool" vague under the circumstances, and I protest, "But this is not a stool, for it has a little back to it, and so it is a chair". He may reply, "But there is really not enough back there to call it a ‘back’, so I call it a ‘stool’".

Meaning: To call a word "vague" is to say that marginal situations can and do arise where there is doubt as to whether the word should or should not be used in describing those particular situations. The technique of vagueness exists where there is uncertainty as to the scope of the word.

7. Ambiguity ~

Example: Joe says, "Henry likes pudding better than his wife". And one or more people hearing him are left wondering whether Henry likes pudding better than he likes his wife, or whether Henry likes pudding more than his wife does.

Meaning: A word or phrase is ambiguous if in the mind of a hearer or reader it has two or more quite different meanings and the interpreter is uncertain as to which was really meant. In argument such a situation would at all times be undesirable.

8. Shift of Meaning ~

Example: "The fellow who was supposed to arbitrate decided in favor of a company and fined the union. Now anyone who takes sides in a dispute is certainly not impartial. So how can this fellow claim to be an impartial arbitrator?".

Meaning: In shift of meaning a word appears explicitly or implicitly two or more times in an argument but with different meanings.

In the example appearing above "impartial" shifts meaning. In its first use it means "wholly refraining from judgment; taking no stand on an issue". But in its second use it means "judging after investigation but without previous bias". Obviously, the arbitrator’s being impartial in the second sense does not necessitate his being so in the first sense. The implied conclusion ("the arbitrator is not impartial") is invalid.

Section C: How Suggestible Are You? (Techniques of Irrelevance)

1. Appearance ~

Example: A floor wax nationally advertised on television is shown in the commercial being applied to a floor with the immediate result of a brilliant luster. The viewer does not know that the floor has been buffed and polished for days, and then dust coated just before the wax was applied in the commercial.

Meaning: The appearance of a thing (or person) is made the basis of our acceptance or rejection without any thought that this appearance may be a deceptive indicator of value.

2. Manner ~

Example: "He was such a well-behaved man, so understanding, so sincerely helpful. He wanted to help us. I couldn’t insult him. So I gave him our savings to invest. He seemed so trustworthy".

Meaning: A person’s manner of behaving is made the basis of our acceptance or rejection of him without any thought that this manner may be a deceptive indicator of value.

3. Degrees & Titles ~

Example: The name on the office door reads "James A. Rydack, The. B, M. Th. R., As. D., Counselor Extraordinary of the Society of Metaphysicians". A woman about to enter the office says to her husband, "With all those degrees and that title, he must know his stuff".

Meaning: We buy or we believe out of respect for degrees or titles attached to the names of those who persuade us.

4. Numbers ~

Example: From an advertisement: "One million more sold this year than last".

Meaning: We buy or believe because of the large numbers associated with the product or proposition.

5. Status ~

Example: Advertisement appearing in the Hampshire Gazette, January 29, 1970: "President Washington, when he addressed the two houses of Congress on the 8th instant, was dressed in a crow-colored suit of American manufacture. This elegant fabric was made from the manufactory in Hartford".

Meaning: Persons or objects for which we have a strong sentiment of respect or esteem -- or which at least possess some degree of fame or prestige -- are introduced into the argument as endorsing that which we are asked to buy or believe.

6. Repetition ~

Example: Radio commercial: "Get up with GET-UP, GET-UP’s got get up. Got it? Get it? Get GET-UP!".

Meaning: We buy or believe because we have heard or seen the idea or product name so often.

7. Slogans ~

Example: "Wheatless, the breakfast of champions"; "LSMFT" (Lusty Strife means Fine Tobacco); "When better cars are built, Bluink will build them"; "Better buy Bards-Eye".

Meaning: A slogan is a short, meaningful, catchy phrase or sentence intended for general consumption and designed to terminate thought and promote action in favor of the slogan maker. However true the slogan may be, if your action is merely a favorable response to the slogan, the technique is successful.

8. Technical Jargon ~

Example: Advertisement: "Liberty Rubber’s new tires contain Durium, the bonding material that makes these tires wear for years".

Meaning: The technique of technical jargon is the use of technical language or unfamiliar words, whether contained in the dictionary or freshly coined, for the purpose of impressing people.

9. Sophistical Formula ~

Example: Mrs Jones: "You know, Ann, I think the Browns must be having trouble. The last two mornings I’ve seen Tom Brown leave the house, slam the door, and drive off in his car looking awfully mad. I’ll bet they’re headed for a divorce".

Mrs Smith: "I don’t know, Barbara. Really, they’ve always seemed to be very much in love".

Meaning: To shut off or close the argument a popular maxim or old saying is quoted. But every controversial situation must be settled in its own terms, and not on the merits (if any) of some proverb.

Section D: What’s Your Weakness? (Techniques of Exploitation)

1. Appeal to Pity ~

Example: Student to professor: "I know that my test grades have been poor and that I deserve an F, but my father is in the hospital and it will just break his heart if I get an F in this course".

Meaning: An attempt is made to secure our commitment by presenting the object of commitment as an object of sympathy, thereby arousing our sympathetic feelings to the point where these feelings determine favorable action.

2. Appeal to Flattery ~

Example: Salesman to young matron answering the door: "Is your mother home?".

Meaning: An attempt is made to persuade us to buy or believe by flattering us on our personal appearance or in some other category where we excel or desire to excel.

3. Appeal to Ridicule ~

Example: The sergeant, on the first day of class, having made a certain statement is asked an embarrassing question by a member of the class. Preferring a cheap victory to an honest discussion, the sergeant replies sarcastically, "I am afraid, Private Jones, that I cannot understand what you mean. You are too deep for me". He then goes on to the next questioner.

Meaning: An attempt is made to influence us to accept a certain proposition by poking fun at those who oppose the proposition.

4. Appeal to Prestige ~

Example: Real estate advertisement: "Live in exclusive Broadmoor Terraces, where successful people live. Deluxe executive apartments furnished in the Continental manner".

Meaning: An attempt is made to induce you to buy or believe by stating or suggesting that such action will secure or maintain prestige for you.

Status and Appeal to Prestige, though related techniques, nevertheless represent quite different errors. In the former case it is suggested that if Jones, a person possessing or allegedly possessing status, buys or believes, so should you. There is no implication that your buying or believing will confer on you equivalent status. The Appeal to Prestige suggests that you should buy or believe because by so doing you will acquire or improve status.

5. Appeal to Prejudice ~

Example: A young man, wishing to make a good impression on his girl friend’s father, learns that the father is a rabid Democrat. So one evening, while waiting for the daughter to finish dressing, he engages the father in conversation and turns it to the point where he can rip the Republicans to pieces. The father later informs the girl that the young man has "good stuff in him and should go a long way".

Meaning: The one who makes the appeal to prejudice attempts to persuade you to act or feel in a certain way by associating his person, product or proposal with a certain one or more of your prejudices, positive or negative --- a prejudice being a prejudgment wrapped in emotion and having a history. Not only does he rekindle your prejudice, he also arouses in you warm feelings toward the one (himself) who apparently shares your prejudice. And so it becomes much easier to make you believe or buy whatever he has to offer.

6. Bargain Appeal ~

Example: The supermarket has a special display at the front of the store: canned peaches by the case (8 cans) for "only $3.20". Checking the shelves where single cans of peaches may be purchased, one finds the same brand priced at 40 cents per can.

Meaning: An attempt is made to get you to buy by appealing to your desire to save money. If you buy without making your own comparison as to price, quality, and service, the technique is successful.
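
The check recommended here is a simple unit-price comparison. A minimal sketch in Python (the prices are taken from the peach example above; the function name is invented for illustration):

    def price_per_unit(total_price, units):
        """Unit price: the figure to compare across package sizes."""
        return total_price / units

    # Figures from the supermarket example: a case of 8 cans for $3.20
    # versus single cans at 40 cents each.
    case_price_per_can = price_per_unit(3.20, 8)
    single_can_price = 0.40

    print(f"Case price per can: ${case_price_per_can:.2f}")
    print(f"Single-can price:   ${single_can_price:.2f}")
    print(f"Savings per can:    ${single_can_price - case_price_per_can:.2f}")

Run as is, this prints a saving of $0.00 per can: the advertised "bargain" case costs exactly the same per can as buying singly.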

7. Folksy Appeal ~

Example: The salesman who on meeting the prospect for the second (or even the first) time slaps him on the back as if he were a long lost brother and addresses him by his nickname.

Meaning: The user of this device places himself or his product on a level of neighborly intimacy with the reader or listener.

8. Join the Bandwagon Appeal ~

Example: "Vote for a winner, Senator Simpkins".

Meaning: An effort is made to influence you to act in a certain way by asserting or implying that that is what is popular or what the majority is doing.

9. Appeal to Practical Consequences ~

Example: Slip inserted in workers’ pay envelopes: "If the Republicans do not win this election, this factory will be forced to close its doors and you will be without a job".

Meaning: An effort is made to persuade us to buy or believe by appealing to our concern for our own individual welfare, i.e., if we do as we are asked, we will secure certain beneficial consequences, while if we refuse to do as asked, the consequences will be harmful.

10. Passing from the Acceptable to the Dubious ~

Example: Advertisement: "The boys in the service abroad want letters more than gifts. Write frequently because some letters may be lost. Write only good news because there are enough unpleasant things going on over there. Buy and write on Barton’s Victory Stationery".

Meaning: The arguer states a series of propositions. The early ones are readily acceptable to the audience or reader, but the concluding statement may be dubious. The listener or reader is expected to accept blindly the later ones because he has accepted those which came before.

Section E: The Fault May Be With The Form (Techniques of Form)

1. Concurrency ~

Example: "Who was president at the time of World War I? Wilson, a Democrat. Who was President at the time of World War II? Roosevelt, a Democrat. Who was President at the time of the Korean War? Truman, a Democrat. Obviously, the Democratic party is the war party".

Meaning: Because things exist or appear simultaneously, it is claimed that one is the cause of the other. The form of the argument is: A is present along with B; therefore A is the cause of B. But two concurrents could never be the cause of one another, for a cause is something antecedent in time.

2. Post Hoc ~

Example: "The bankers are the source of our troubles. You will notice that every depression is preceded by bank failures".

Meaning: Because two events (or things) follow one another in close temporal succession the first event is claimed to be the cause of the second. The form of the argument is: A precedes B; therefore A is the cause of B. We may take as a hypothesis for testing, that A is a (or the) cause of B, but we should not forget that any one of a score of other preceding events is equally worthy of consideration.

3. Selected Instances ~

Example: Someone says, "All professors are conceited". When asked for his evidence he replies, "Well, how about Professor Smith, Professor Jones, and Professor Brown. Everybody knows they’re as conceited as they come". But he deliberately skips over Professor Black whom he knows to be a model of humility.

Meaning: Support is drawn for a position by choosing only those cases or instances which can back it up and disregarding those cases or instances which either contradict or do not support the position. The form of the argument is: All A is B, because A1, A2, A3 and A4 are B. The form is invalid; the arguer knows that at least A5 is not B.

4. Hasty Generalization ~

Example: Having observed five women to be poor drivers, Jones generalizes and declares all women are poor drivers.

Meaning: The arguer jumps to a general or blanket conclusion about members of a given group on the basis of an unrepresentative or insufficient number of cases. The form of the argument is: A1, A2, A3 are B; therefore all A is B.

Selected Instances and Hasty Generalization have much the same effect. There are important differences, however. Hasty Generalization typically occurs on an emotional basis, while Selected Instances is typically coldly calculating. In the former case there is, at the time at least, no awareness of opposed instances; in the latter case, there is. Selected Instances is not merely crooked thinking but dishonesty. On the surface the two are apt to look alike, and until we have evidence that the arguer is really deliberately closing his eyes to contradictory cases, we cannot label the technique as Selected Instances.

5. Faulty Analogy ~

Example: "Last quarter I had a student by the name of Orzymski who did good work. This quarter I have another student by that name, and I’m expecting good work from him".

Meaning: To reason analogically is to reason that because two or more things or types of things are alike in some one or more respects (we may call this the antecedent resemblance), they will therefore be found alike in some other respect(s) --- the consequent resemblance. In cases of reliable analogies the antecedent factor is already known to have some bearing on the consequent factor. In faulty analogies such knowledge is lacking. The form of the argument is: A is like B in respect c; therefore A is like B in respect d.

In our example, while it is true that Orzymski is a rare name in English-speaking societies and while it is even probable that a second Orzymski enrolled at the same college would be related to the first, we need evidence that heredity is a decisive factor in scholastic performance. But an analogy is no stronger than its linking generalization, which in this case is "Heredity determines scholastic performance". Since our experience contains an abundance of cases of relatives with widely different scholastic records, we can have no confidence in an analogy based on such a linking generalization.

Some arguments take the form of alleging a complete analogy: two things are alike to the point of identity. The argument is: A (or all A) is c and B (or all B) is c; therefore B is A (or A is B). "Communists will not take the oath of allegiance and neither will Jones. Therefore he must be a Communist". The absurdity of this argument becomes readily evident when we see it is just like saying, "Dogs have tails; this cat has a tail; so this cat is a dog".

In discussing Metaphor and Simile the point was made that neither one, especially Metaphor, should be used in controversial situations. That remains true. But a metaphor or simile appearing by itself is not an argument, and it is very uncertain in meaning. Analogies make use of simile and make clear how A is compared to B, but it must still be said that an analogical argument is strong only when A and B are essentially the same kind of thing and A has a property deriving from its essential nature, so that B must have the same property.

6. Composition ~

Example: "He’s a nice boy; she’s a nice girl. I’m sure they’ll make a nice married couple".

Meaning: We reason as if the properties of elements or individuals were always (i.e., necessarily) the properties of the wholes which they constitute. But the assumption that what holds true of a part is automatically true of the whole cannot be justified. The form of the argument is: A is part of B and A is c; therefore B is c.

7. Division ~

Example: "How dare you criticize any member of the Harvard faculty? Don’t you know that this faculty has the highest reputation of any university faculty in the United States?".

Meaning: We reason as if the properties of any whole are always (i.e., necessarily) properties of each part. But the assumption that what holds true of a whole is automatically true of its parts cannot be justified. The form of the argument is: A is part of B and B is c; therefore A is c.

8. Non Sequitur ~

Example: "Your children deserve the best milk. Buy Lorden’s".

Meaning: The conclusion is not necessitated by the premise(s).

Strictly speaking, all the techniques so far covered where the conclusion is invalid are Non Sequiturs. There is, therefore, no one form for a Non Sequitur. In the example cited above no more reason is given to buy Lorden’s milk than to buy Healtest or any one of a hundred other brands of milk.

Since the Non Sequitur label can be applied to so many other techniques, the label will be reserved only for those invalidities that cannot be classified under some other heading. They are, at least, Non Sequiturs.

Section F: Tricks of Argument (Techniques of Maneuver)

1. Diversion ~

Example: Jones: "I think that American industry should be run on a profit-sharing basis".
Smith: "Really! I don’t think so. I don’t see any obligation on the part of owners to share profits with their employees".
Jones: "Profit-sharing will provide the worker with greater incentive".
Smith: "Workers don’t need more incentive. They need higher wages. I remember the wages I got as a boy, working in the bean fields. They were pitifully low".
Jones: "Yes, they were. I remember those bean-picker wages. As I recall, Smith, you were the best picker in the field".
Smith: "No, Jones, I beg to differ. You were the best picker".

Meaning: To divert is to get off the subject. With the original issue left unresolved, one of the disputants begins to talk of something which has no apparent evidential value for his thesis. The diversion is full (instead of merely partial) when the second party to the argument accepts the diversion and joins in discussion or argument over the new issue.

2. Disproving a Minor Point ~

Example: Jones: "I believe that the installment system of buying has been a boon to America, since (1) it has enabled the ordinary man to have what has hitherto been only a luxury for the well-to-do; (2) it has raised the standard of living; (3) it has provided employment for many clerks, typists, etc., who must keep installment accounts".
Smith: "After all, the head of a gang of thieves provides gainful employment, and so any defense of installment buying on the grounds of its providing employment is silly and evades the question as to whether this kind of employment is desirable. Therefore, I don’t see that you have presented any substantial reason for favoring installment buying".

Meaning: When you have, say, two or more pieces of evidence of varying degrees of importance, your opponent takes one of the less weighty of your arguments (perhaps a rather trivial point) and discredits that. He then acts as if (or attempts to create the impression that) he has disproved your whole case.

3. Ad Hominem ~

Example: Smith: "This town needs more efficient and vigorous police protection. Some on the police force should be retired and some should be fired".
Jones: "Absolutely not. And who are you to talk about improving our police protection? As I recall, 30 years ago you did time for forgery".

Meaning: Instead of attacking your proposition, your opponent directs his argument against you as a person. Although a person’s past record is something one should take into consideration, it should not be one’s sole basis for judging an argument.

The Ad Hominem attack often takes the form of discounting a proposition by attributing prejudice or bias to its supporters. But what motivates us to believe as we do, or to say what we say, is one thing. The truth or falsity, validity or invalidity, of what we say is another. It is possible to be prejudiced but right.

Another form of Ad Hominem is charging your opponent with the inconsistency of not living up to what he advocates.

4. Appeal to Ignorance ~

Example: "I know that man’s soul is immortal. Why? Because you can’t prove that it isn’t".

Meaning: A proposition (1) is said to be true because it has not been disproved or (2) is said to be untrue because it has not been proved.

What is not disproved on a given occasion is not necessarily true. Is a scientific theory accepted as true because you cannot disprove it? Rather, the theory must be verified positively. Every person who presents a proposition in argument has the obligation to offer at least one reason in defense of it.

Likewise, your opponent’s successful attack on all premises or reasons you advance does not in all strictness make his position right or yours wrong. All he has shown is that your position is not true for your reasons. Other people, now or later, may be able to produce better reasons. Similarly, your being able to show that your adversary in his defense has involved himself in contradiction is not sufficient to prove him wrong. Smith may be arguing that the taking of life is evil, but admits that he doesn’t object to killing animals for food. There is a contradiction and confusion, but Smith may still be right that the taking of life is evil.

5. Leading Question ~

Example: (1) "It was early in the morning, wasn’t it?".

(2) "Since when have you stopped drinking?"

Meaning: A leading question is one which (1) dictates or suggests an answer or (2) one which incriminates the answerer (or places him in an undesirable position) no matter how he answers. In the first example the answer "Yes" is natural and is apt to be forthcoming, especially if the person to whom the question is addressed is highly suggestible and/or half awake. In the second example an answer in a form appropriate to the question ("Since Tuesday") would still be an admission that one did drink.

Under the second form of Leading Question may be included any question which assumes as true that which is yet controversial and undecided. "Why is it that labor leaders are so much less concerned about the general welfare than are the leaders of business?". The one to whom the question is addressed tends to ask himself, "Now why is that?", when he ought to immediately respond, "Wait a minute! Let’s settle first whether it is true that they are less concerned".

6. Complex Question ~

Example: "Do you deny that you were in the room at the time of the murder? Do you deny that you have always hated the man? Do you deny that if you couldn’t have killed him yourself you would have been glad to have someone else do the dirty job? Answer me, ‘yes’ or ‘no’".

Meaning: A series of questions are put and then the questioner demands that they be answered as a whole by either "yes" or "no". Since there is always the possibility that the answerer needs to answer each of the questions separately and differently, the complex question puts the answerer in an unfair position.

Although the questions contained in the series may each be a leading question, the complex question differs in that separate answers are not desired.

7. Inconsequent Argument ~

Example: Prosecuting Attorney: "The defendant is charged with assault and attempted robbery. There can be no doubt of this man’s guilt. In the past ten years he has been convicted 13 times on different charges of forgery, theft, and rape. (The prosecutor then goes into each of these cases in detail and passes to the jurors documents that support what he has said about the defendant’s record.) The sickening record that I have exhibited speaks for itself. Gentlemen, I ask for a verdict of ‘guilty’".

Meaning: The arguer proves or establishes something, but not what he said he would prove.

In the example given above, surely proof of a previous bad record is a far cry from proof of guilt in the offenses charged. Proof of a bad record is "inconsequential" --- of no consequence. If bad record proves guilt, then for every crime there are millions of guilty people.

Inconsequent Argument differs from Diversion in that in the latter nothing is proved, whereas in the former something has been proven, though not what the arguer was expected to prove.

8. Attacking a Straw Man ~

Example: (1) Smith: "I am opposed to capital punishment".
Jones: "I’m not".
Smith: "You ought to be. Capital punishment is unchristian".
Jones: "People like you who oppose punishing criminals nauseate me".

(2) Smith: "I am opposed to capital punishment".
Jones: "You fellows that are against capital punishment must want your daughters molested every time they leave the house!".

Meaning: Your opponent either (1) restates your position falsely or (2) exaggerates the consequences that may follow from your position.

9. Victory by Definition ~

Example: Jones: "Communism cannot help but work".
Smith: "I disagree. Look at Russia; things are in a mess there".
Jones: "Oh, sure, but that’s not real communism".
Smith: "Look at China; communism is not working there".
Jones: "They don’t have communism there either".

Meaning: A position is defined in such a way as to exclude all negative cases or adverse evidence.

Evidently Jones is defining "communism" as "that political system which cannot help but work". This certainly does not accurately report how most people use the term. Instead of destroying Smith’s position by evidence, Jones leaves him no ground for an opposing position and so destroys the argument as a whole. The same effect would have been secured if Jones had started out saying, "True communism cannot help but work".

10. Begging the Question ~

Example: (1) "Man is a social animal because he is gregarious".

(2) Jones (at the bank): "I would like a loan".
Banker: "What recommendations of references do you have, something to establish that if we loan you the money, you will pay it back?".
Jones: "Well, I can refer you to my friend Quimby; he’ll vouch for me. He’ll tell you that when I say I’ll pay, I will".
Banker: "But we don’t know Quimby, so how do we know he can be trusted?".
Jones: "Oh, I can assure you that Quimby can be trusted".

Meaning: This technique involves assuming as true what has yet to be proved. Frequently the same proposition is used both as premise and as conclusion in a single argument. This may be done either (1) by the use of synonymous terms or (2) by circular argument, which involves the use of A to prove B and B to prove A.

IV. The Experts Game [Not included here]

V. Summary

As was pointed out in the introduction, the PROPAGANDA GAME is intended to be an introduction to "clear thinking", not a completed course of study. As a followup to the game we recommend Dr Moulds' book, Thinking Straighter. Dr Moulds' book includes more comprehensive treatment of the techniques used in the PROPAGANDA GAME with added examples, a chapter on The External Marks of Authority --- Who Says It? Why Does He Say It? and What Is The Medium of the Argument? --- and a chapter on Internal Criteria of Reliability --- Documented Evidence, Sound Generalization, Internal Consistency, Impartial Treatment, Valid Deduction, and Probable Prediction. After reading this book you should be thinking straighter. You should become more accurate and precise in the use of words and more demanding of precision on the part of others. You should be more careful in drawing your own conclusions and less ready to accept at first glance the conclusions of others. The book is published by the Wm. C. Brown Co (Dubuque, IA).

In conclusion, in a free and democratic society, it is incumbent upon every citizen to be well informed on propaganda techniques. Every citizen should, therefore, play the PROPAGANDA GAME (We're sure you will want to label this technique).

VI. Suggested Answers [Not included here]

VII. Appendix [Not included here]


Stephen Downes' Guide to the Logical Fallacies
(Also published on the Internet as "Brian Yoder's Fallacy Zoo" @ http://www.primenet.com/~byoder/fallazoo.htm )

Contents ~

Fallacies of Distraction
    False Dilemma
    Argument From Ignorance (argumentum ad ignorantiam)
    Slippery Slope
    Complex Question
Appeals to Motives in Place of Support
    Appeal to Force (argumentum ad baculum)
    Appeal to Pity (argumentum ad misericordiam)
    Appeal to Consequences (argumentum ad consequentiam)
    Prejudicial Language
    Appeal to Popularity (argumentum ad populum)
Changing the Subject
    Attacking the Person (argumentum ad hominem)
    Appeal to Authority (argumentum ad verecundiam)
    Anonymous Authorities
    Style Over Substance
Inductive Fallacies
    Hasty Generalization
    Unrepresentative Sample
    False Analogy
    Slothful Induction
    Fallacy of Exclusion
Fallacies Involving Statistical Syllogisms
    Accident
    Converse Accident
 


http://www.primenet.com/~byoder/distract.htm

Each of these fallacies is characterized by the illegitimate use of a logical operator in order to distract the reader from the apparent falsity of a certain proposition. The following fallacies are fallacies of distraction:

False Dilemma

Definition:
A limited number of options (usually two) is given, while in reality there are more options. A false dilemma is an illegitimate use of the "or" operator.

Examples:
(i) Either you're for me or against me.
(ii) America: love it or leave it.
(iii) Either support gun confiscation or have the government provide everyone with his own private nuclear warhead, you decide which one.

Proof:
Identify the options given and show (with an example) that there is an additional option.

References:
Cedarblom and Paulsen: 136

Argument From Ignorance (argumentum ad ignorantiam)

Definition:
Arguments of this form assume that since something has not been proven false (or cannot be), it is therefore true. Conversely, such an argument may assume that since something has not been proven true, it is therefore false. (This is a special case of a false dilemma, since it assumes that all propositions must either be known to be true or known to be false.)

As Davis writes, "Lack of proof is not proof." (p. 59)

Examples:
(i) Since you cannot prove that ghosts do not exist, therefore they must exist.
(ii) Since scientists have not proven that global warming will occur, therefore it won't.
(iii) Fred said that he is smarter than Jill, but he didn't prove it, so it must be false.

Proof:
Identify the proposition in question. Argue that without evidence or proof no claims whatsoever can be derived on the subject. Such a claim is neither true nor false, but arbitrary.

References:
Copi and Cohen: 93; Davis: 59; Rand: 79

Slippery Slope

Definition:
In order to show that a proposition is unacceptable, a sequence of increasingly unacceptable events is claimed to follow from it. A slippery slope is an illegitimate compositing of the "if-then" operator. Of course this ought to be distinguished from pointing out a chain of causal consequences from a choice or position. The difference is that in a slippery slope fallacy the intermediate causal connections are unproven.

Examples:
(i) If we pass laws against private nuclear weapons, then it won't be long before we pass laws against guns, and then we will begin to restrict other rights, and finally we will end up living in a communist state. Thus, we should not ban private nuclear weapons.
(ii) You should never gamble. Once you start gambling you find it hard to stop. Soon you are spending all your money on gambling, and eventually you will turn to crime to support your habit.
(iii) If I make an exception for you then I have to make an exception for everyone.

Proof:
Identify the proposition being refuted and identify the final event in the series of events. Then show that this final event need not occur as a consequence of the proposition.

References:
Cedarblom and Paulsen: 137

Complex Question

Definition:
Two otherwise unrelated points are treated as a single proposition. The reader is expected to accept or reject both together, when in reality one may be acceptable while the other is not. A complex question is an illegitimate use of the "and" operator.

Examples:
(i) You should support home schooling and the God-given right of parents to raise their children according to their own beliefs. (Whether parents have a right to choose how to raise their children and whether that right includes home schooling is an entirely different issue. There is an additional complex question here since one might believe that a certain right exists but not believe it comes from God.)
(ii) Do you support freedom and the right to bear arms? (What if I think people ought to be free to bear arms but that it isn't a right? What if I think it is a right, but I don't think it matters what rights people have?)
(iii) Have you stopped beating your wife? (This implicitly asks two questions: did you beat your wife, and did you stop?)

Proof:
Identify the two propositions illegitimately conjoined and show that one doesn't imply the other.

References:
Cedarblom and Paulsen: 86; Copi and Cohen: 96

Appeals to Motives in Place of Support

The fallacies in this section have in common the practice of appealing to emotions or other psychological factors. In this way, they do not provide reasons for belief, but merely "trick" people into agreeing with them one way or another without proof.

The following fallacies are appeals to motive in place of support:

Appeal to Force (argumentum ad baculum)

Definition:
The reader is threatened with unpleasant consequences if they do not agree with the author.

Examples:
(i) You had better agree that the new company policy is the best if you expect to keep your job.
(ii) You had better admit that racism is wrong or one day you might just find out how much you care about your wife and kids.
(iii) The defendant ought to be found innocent because if he isn't, there will be a riot and many innocent citizens will be hurt or killed.
(iv) Accept Jesus as your savior or face the rack and branding irons!

Proof:
Identify the threat and the proposition and argue that the threat is unrelated to the truth or falsity of the proposition.

References:
Cedarblom and Paulsen: 151; Copi and Cohen: 103

Appeal to Pity (argumentum ad misericordiam)

Definition:
The reader is told to agree to the proposition because of the pitiful state of the author.

Examples:
(i) How can you say that ball was out of bounds? It was so close, and I'm down ten games to two.
(ii) We hope you'll accept our recommendations. We spent the last three months working extra time on it and we are quite exhausted.
(iii) You ought to think highly of my term paper especially since I graduated last in my class.
(iv) You ALWAYS win these arguments. Can't you let me win just this once?

Proof:
Identify the proposition and the appeal to pity and argue that the pitiful state of the arguer has nothing to do with the truth of the proposition.

References:
Cedarblom and Paulsen: 151; Copi and Cohen: 103; Davis: 82

Appeal to Consequences (argumentum ad consequentiam)

Definition:
The author points to the disagreeable consequences of holding a particular belief in order to show that this belief is false.

Examples:
(i) You can't agree that evolution is true, because if it were, then we would be no better than the apes.
(ii) You must believe in God, otherwise life would have no meaning.
(iii) I could never agree that smoking is harmful because if I did I would have to stop.

Proof:
Identify the consequences and argue that what we want to be the case does not affect what is in fact the case.

References:
Cedarblom and Paulsen: 100; Davis: 63

Prejudicial Language

Definition:
Loaded or emotive terms are used to attach value or moral goodness to believing the proposition, or suspicion or dislike to the opposing position.

Examples:
(i) Right thinking Californians will agree with me that we should have another free vote on capital punishment.
(ii) Not only is paying a higher income tax a patriotic duty, it is also a sacred obligation.
(iii) Senator Jones "claims" that the new tax rate will reduce the deficit. (The use of "claims" implies that what Jones says is false.)
(iv) The proposal is likely to be resisted by the bureaucrats on Capitol Hill. (Compare this to: The proposal is likely to be rejected by officials on Capitol Hill.)

Proof:
Identify the prejudicial terms used (eg. "Right thinking Californians" or "sacred obligation"). Show that disagreeing with the conclusion does not make a person "wrong thinking" or "irresponsible" unless some independent proof can be offered. If they can't they are just _begging the question_.

References:
Cedarblom and Paulsen: 153; Davis: 62

Appeal to Popularity (argumentum ad populum)

Definition:
A proposition is held to be true because it is widely held to be true or is held to be true by some (usually superior) sector of the population.
This fallacy is sometimes also called the "Appeal to Emotion" because emotional appeals often sway the population as a whole.

Examples:
(i) Everyone likes beautiful people, so buy Teeth-Brite toothpaste and become beautiful. Everyone will approve of your choice.
(ii) Polls suggest that President Jones will win the election, so you may as well vote for him.
(iii) Everyone knows that the Earth is flat, so why do you persist in your outlandish claims?
(iv) Most educated people know that it is better to use paper bags than plastic ones. (An appeal to the superior group among whom the position is supposedly popular. See also argumentum ad verecundiam.)

References:
Copi and Cohen: 103; Davis: 62

Changing the Subject

The fallacies in this section change the subject by discussing the person making the argument instead of discussing reasons to believe or disbelieve the conclusion. While on some occasions it is useful to cite authorities, it is almost never appropriate to discuss the person instead of the argument.

Attacking the Person (argumentum ad hominem)

Definition:
The person presenting an argument is attacked instead of the argument itself. This takes many forms. For example, the person's character, nationality or religion may be attacked. Alternatively, it may be pointed out that a person stands to gain from a favourable outcome. Or, finally, a person may be attacked by association, or by the company he keeps.

There are three major forms of Attacking the Person:
Ad hominem (abusive): instead of attacking an assertion, the argument attacks the person who made the assertion.
Ad hominem (circumstantial): instead of attacking an assertion the author points to the relationship between the person making the assertion and the person's circumstances.
Ad hominem (tu quoque): this form of attack on the person notes that a person does not practise what he preaches.

Examples:
(i) You may argue that God doesn't exist, but you are just a fat idiot. (ad hominem abusive)
(ii) We should discount what Steve Forbes says about cutting taxes because he stands to benefit from a lower tax rate. (ad hominem circumstantial)
(iii) We should disregard Fred's argument because he is just angry about the fact that the defendant once cheated him out of $100. (ad hominem circumstantial)
(iv) You say I should give up alcohol, but you haven't been sober for more than a year yourself. (ad hominem tu quoque)
(v) You claim that Mr. Jones is innocent, but why should anyone listen to you? You are a Mormon after all. (ad hominem circumstantial)

Proof:
Identify the attack and show that the character or circumstances of the person have nothing to do with the truth or falsity of the proposition being defended.

References:
Barker: 166; Cedarblom and Paulsen: 155; Copi and Cohen: 97; Davis: 80

Appeal to Authority (argumentum ad verecundiam)

Definition:
While sometimes it may be appropriate to cite an authority to support a point, often it is not. In particular, an appeal to authority is inappropriate if:
(i) the person is not qualified to have an expert opinion on the subject;
(ii) experts in the field disagree on this issue;
(iii) the authority was making a joke, drunk, under duress, or otherwise not being serious;
(iv) there is no supporting evidence or argument to justify the position. If O.J. Simpson (an expert on football) insisted that footballs were made of cabbage leaves, that wouldn't constitute an argument to that effect.

A variation of the fallacious appeal to authority is hearsay. An argument from hearsay is an argument which depends on second or third hand sources.

Examples:
(i) Noted psychologist Elaine Johnson recommends that you buy the EZ-Rest Hot Tub. (She is not an expert on hot tubs.)
(ii) Economist Alan Greenspan argues that going on the gold standard will lead to economic prosperity. (Although Greenspan is an expert, not all economists agree on this point, nor does his saying so make it true.)
(iii) We are headed for nuclear war. Last week Ronald Reagan remarked that we begin bombing Russia in five minutes. (Of course, he said it as a joke during a microphone test.)
(iv) My friend heard on the news the other day that The United States will declare war on Canada. (This is a case of hearsay; in fact, the reporter said that The United States would not declare war.)
(v) The Los Angeles Times reported that sales were up 8.1 percent this year. (This is hearsay; we are not in a position to check the Times' sources.)

Proof:
Point out that either (i) the person cited is not an authority in the field, or that (ii) being an expert in the field doesn't automatically make one right and insist that the argument advanced be addressed without the appeal to authority.

References:
Cedarblom and Paulsen: 155; Copi and Cohen: 95; Davis: 69

Anonymous Authorities

Definition:
The authority in question is not named. This is a type of appeal to authority because when an authority is not named it is impossible to confirm that the authority is an expert or how the conclusion was arrived at. Though this is just a type of appeal to authority, the fallacy is so common it deserves special mention.

A variation on this fallacy is the appeal to rumour. Because the source of a rumour is typically not known, it is not possible to determine whether to believe the rumour. Sometimes false and harmful rumours are deliberately started in order to discredit an opponent.

Examples:
(i) A government official said today that the new gun law will be proposed tomorrow.
(ii) Experts agree that the best way to prevent nuclear war is to prepare for it.
(iii) It is held that there are more than two million needless operations conducted every year.
(iv) Rumour has it that the President will declare a national holiday on his birthday.

Proof:
Argue that because we don't know the source of the information we have no way to evaluate the reliability of the information or whether it was derived rationally. Insist on seeing the proof for yourself.

References:
Davis: 73

Style Over Substance

Definition:
The manner in which an argument (or arguer) is presented is taken to affect the likelihood that the conclusion is true.

Examples:
(i) Nixon lost the presidential debate because of the sweat on his forehead.
(ii) Trudeau knows how to move a crowd. He must be right.
(iii) Why don't you take the advice of that nicely dressed young man?

Proof:
While it is true that the manner in which an argument is presented will affect whether people believe that its conclusion is true, nonetheless, the truth of the conclusion does not depend on the manner in which the argument is presented. In order to show that this fallacy is being committed, show that the style in this case does not affect the truth or falsity of the conclusion.

References:
Davis: 61

Inductive Fallacies

Inductive reasoning consists of inferring from the properties of a sample to the properties of a whole class of entities.

For example, suppose we have a barrel containing 1,000 beans. Some of the beans are black and some of the beans are white. Suppose now we take a sample of 100 beans from the barrel and that 50 of them are white and 50 of them are black. Then we could infer inductively that half the beans in the barrel (that is, 500 of them) are black and half are white.

All inductive reasoning depends on the similarity of the sample and the population. The more similar the sample is to the population as a whole, the more reliable will be the inductive inference. On the other hand, if the sample is relevantly dissimilar to the population, then the inductive inference will be unreliable.
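
The barrel example can be made concrete with a few lines of code. The following is only a minimal Python sketch of the inductive step; the true 600/400 split of the barrel (and the variable names) are invented for the illustration, not figures from the text:

import random

# Illustrative only: assume the barrel really holds 600 black and 400 white beans.
barrel = ["black"] * 600 + ["white"] * 400   # 1,000 beans in all
random.shuffle(barrel)

sample = random.sample(barrel, 100)          # draw 100 beans at random
black_share = sample.count("black") / len(sample)

# Inductive step: project the sample proportion onto the whole barrel.
estimated_black = round(black_share * len(barrel))
print("sample proportion black:", black_share)
print("estimated black beans in barrel:", estimated_black)
print("actual black beans in barrel:", barrel.count("black"))

Run repeatedly, the estimate stays close to the true count precisely because the sample is drawn at random from the whole barrel; this is the similarity condition described above.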

Hasty Generalization

Definition:
The sample of evidence is too small to support the conclusion.

Examples:
(i) Fred, the Australian, stole my wallet. Thus, all Australians are thieves. (Of course, we shouldn't judge all Australians on the basis of one example.)
(ii) I asked six of my friends what they thought of the new taxes and they agreed that they are a good idea. The new taxes are therefore generally popular.
(iii) All crows are black. (Even though most of the crows (or even all of them) we see are black, it would be hasty to make such a generalization given what we know about the nature of albinos.)
(iv) Pets are nice and cuddly therefore animals are generally nice and cuddly.

Proof:
Identify the importance of establishing an appropriate standard of inductive proof. Then demonstrate what the standard ought to be in this case and why the author either chose the wrong standard (or none at all) or didn't meet the correct one.

References:
Barker: 189; Cedarblom and Paulsen: 372; Davis: 103

Unrepresentative Sample

Definition:
The examples used in an inductive inference are relevantly different from the population as a whole.

Examples:
(i) To see how Americans will vote in the next election we polled a hundred people in Greenwich Village. This shows conclusively that the Democratic Party will sweep the polls. (People in Greenwich Village tend to be more liberal, and hence more likely to vote Democratic, than people in the rest of the country.)
(ii) The apples on the top of the box look good. The entire box of apples must therefore be good. (Of course, the rotten apples may be hidden beneath the surface where the moisture and darkness facilitate rotting.)

Proof:
Show how the example cases are relevantly different from the population as a whole, then show that because the examples are different, the conclusion does not follow.
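
The Greenwich Village example above can be illustrated with a rough Python sketch. The percentages below (80% Democratic in the polled neighbourhood versus 50% nationally) are invented assumptions used only to show how an unrepresentative sample skews the estimate:

import random

random.seed(1)  # fixed seed so the illustration is repeatable

# Hypothetical electorate: one liberal neighbourhood inside a 50/50 country.
neighbourhood = ["D" if random.random() < 0.80 else "R" for _ in range(10000)]
country = ["D" if random.random() < 0.50 else "R" for _ in range(1000000)]

local_poll = random.sample(neighbourhood, 100)   # unrepresentative sample
national_poll = random.sample(country, 100)      # sample drawn from everyone

print("poll of one neighbourhood, D support (%):", local_poll.count("D"))
print("nationwide random poll, D support (%):", national_poll.count("D"))
print("true national D support (%):", round(100 * country.count("D") / len(country)))

The biased poll consistently overstates Democratic support, while the random national sample tracks the true share; the flaw lies in the sample, not in the arithmetic.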

References:
Barker: 188; Cedarblom and Paulsen: 226; Davis: 106

False Analogy

Definition:
In an analogy, two objects (or events), A and B are shown to be similar. Then it is argued that since A has property P, so also B must have property P. An analogy fails when the two objects, A and B, are different in a way which affects whether they both have property P.

Examples:
(i) Employees are like nails. Just as nails must be hit in the head in order to make them work, so must employees.
(ii) Government is like business, so just as business must be a money-making enterprise, so also must government. (But the objectives of government and business are completely different, so they will have to meet different criteria.)

Proof:
Identify the two objects or events being compared and the property which both are said to possess. Show that the two objects are different in a way which will affect whether they both have that property.

References:
Barker: 192; Cedarblom and Paulsen: 257; Davis: 84

Slothful Induction

Definition:
The proper conclusion of an inductive argument is denied despite the evidence to the contrary.

Examples:
(i) Hugo has had twelve car accidents in the last six months, yet he insists that it is just a coincidence and not his fault. (Inductively, the evidence is overwhelming that it is his fault. This example borrowed from Barker, p. 189)
(ii) Poll after poll shows that the N.D.P will win fewer than ten seats in Parliament. Yet the party leader insists that the party is doing much better than the polls suggest. (The N.D.P. in fact got nine seats.)
(iii) Sure, that drug has been fatal in 100 previous tests, but how do you know some unknown factor wasn't present causing the deaths? Maybe the drug is perfectly safe. (This involves refusing to draw an inductive conclusion on the basis that some arbitrary assertion has not been disproven. It is the typical argument of a skeptic, who feels no need for any evidence to justify rejecting a generalization, no matter how much evidence points to the other conclusion.)

Proof:
Make the relevant standard of proof clear, point out that the evidence offered does not meet it, and point out the contrary evidence not taken into account in the induction. Typically this will lead either to agreement, or to a dispute over the applicability of the specified standard of proof or of the contrary evidence. In each case the argument needs to be shown to be a rational one rather than some arbitrary choice.

I find that this kind of skepticism toward any and all inductive generalizations (except perhaps the ones the author is prejudiced in favor of) is the last refuge of most sloppy (and dishonest) thinkers, since they can assert just about any possibility (yes, including that an omnipotent god is hiding the truth from us or that we are just brains in vats manipulated by mad scientists) to deny the validity of the inductive basis of their opponents' positions.

References:
Barker: 189

Fallacy of Exclusion ~

Definition:
Relevant evidence which would undermine an inductive argument is excluded from consideration. The requirement that all relevant information be included is called the "principle of total evidence".

Examples:
(i) Ross Perot is over 60 years old. Most people over 60 years old make less than $45,000/year; therefore Ross Perot probably makes less than $45,000/year. (This ignores the fact that he owns billions of dollars worth of stock and other profit-making property.)
(ii) The Jets will probably win this game because they've won nine out of their last ten. (Eight of their wins came over last place teams, and today they are playing the first place team.)

Proof:
Give the missing evidence and show that it is relevant to the outcome of the inductive argument. Note that it is not sufficient simply to show that not all of the evidence was included; it must be shown that the missing evidence is relevant to the conclusion.

References:
Davis: 115

Fallacies Involving Statistical Syllogisms

A statistical generalization is a statement which is usually true, but not always true. Very often these are expressed using the word "most", as in "Most conservatives favor welfare cuts." Sometimes the word "generally" is used, as in "Conservatives generally favor welfare cuts." Or, sometimes, no specific word is used at all, as in: "Conservatives favor welfare cuts."

Fallacies involving statistical generalizations occur because the generalization is not always true. Thus, when an author treats a statistical generalization as though it were always true, the author commits a fallacy.

Accident ~

Definition:
A general rule is applied when circumstances suggest that an exception to the rule should apply.

Examples:
(i) The law says that you should not travel faster than 55 mph, thus even though your passenger was having a heart attack, you should not have travelled faster than 55 mph.
(ii) It is good to return things you have borrowed. Therefore, you should return this automatic rifle to the madman you borrowed it from. (Adapted from Plato's Republic, Book I.)

Proof:
Identify the generalization in question and show that it is relevant only in a context different from the one in question. Show that the reasons for the original generalization that justified the rule don't hold in the specified case.

References:
Copi and Cohen: 100

Converse Accident ~

Definition:
An exception to a generalization is applied to cases where the generalization should apply.

Examples:
(i) Because we allow terminally ill patients to use heroin, we should allow everyone to use heroin.
(ii) Because you allowed Jill, who was hit by a truck, to hand in her assignment late, you should allow me to hand mine in late too because I was lazy and didn't get it done.

Proof:
Identify the generalization in question and show how the special case was an exception to the generalization. It helps to make the context under which the generalization was validated clear since that's typically where the mistakes are made.

References:
Copi and Cohen: 100


Excerpts: Psychological Operations Field Manual No. 33-1, Appendix I: PSYOP Techniques; HQ, Department of the Army (31 August 1979).

Propaganda Techniques
Knowledge of propaganda techniques is necessary to improve one's own propaganda and to uncover enemy PSYOP stratagems. Techniques, however, are not substitutes for the procedures in PSYOP planning, development, or dissemination.

Techniques may be categorized as:

Characteristics of the content self-evident
No additional information is required to recognize the characteristics of this type of propaganda. "Name calling" and the use of slogans are techniques of this nature.

Additional information required to be recognized
Additional information is required by the target or analyst for the use of this technique to be recognized. "Lying" is an example of this technique. The audience or analyst must have additional information in order to know whether a lie is being told.

Evident only after extended output: "Change of pace" is an example of this technique. Neither the audience nor the analyst can know that a change of pace has taken place until various amounts of propaganda have been brought into focus.

Nature of the arguments used: An argument is a reason, or a series of reasons, offered as to why the audience should behave, believe, or think in a certain manner. An argument is expressed or implied.

Inferred intent of the originator: This technique refers to the effect the propagandist wishes to achieve on the target audience. "Divisive" and "unifying" propaganda fall within this technique. It might also be classified on the basis of the effect it has on an audience.
Self-Evident technique ~
Appeal to Authority. Appeals to authority cite prominent figures to support a position, idea, argument, or course of action.

Assertion. Assertions are positive statements presented as fact. They imply that what is stated is self-evident and needs no further proof. Assertions may or may not be true.

Bandwagon and Inevitable Victory. Bandwagon-and-inevitable-victory appeals attempt to persuade the target audience to take a course of action "everyone else is taking." "Join the crowd." This technique reinforces people's natural desire to be on the winning side. This technique is used to convince the audience that a program is an expression of an irresistible mass movement and that it is in their interest to join. "Inevitable victory" invites those not already on the bandwagon to join those already on the road to certain victory. Those already, or partially, on the bandwagon are reassured that staying aboard is the best course of action.

Obtain Disapproval. This technique is used to get the audience to disapprove an action or idea by suggesting the idea is popular with groups hated, feared, or held in contempt by the target audience. Thus, if a group which supports a policy is led to believe that undesirable, subversive, or contemptible people also support it, the members of the group might decide to change their position.

Glittering Generalities. Glittering generalities are intensely emotionally appealing words so closely associated with highly valued concepts and beliefs that they carry conviction without supporting information or reason. They appeal to such emotions as love of country, home; desire for peace, freedom, glory, honor, etc. They ask for approval without examination of the reason. Though the words and phrases are vague and suggest different things to different people, their connotation is always favorable: "The concepts and programs of the propagandist are always good, desirable, virtuous."

Generalities may gain or lose effectiveness with changes in conditions. They must, therefore, be responsive to current conditions. Phrases which called up pleasant associations at one time may evoke unpleasant or unfavorable connotations at another, particularly if their frame of reference has been altered.

Vagueness. Generalities are deliberately vague so that the audience may supply its own interpretations. The intention is to move the audience by use of undefined phrases, without analyzing their validity or attempting to determine their reasonableness or application.

Rationalization. Individuals or groups may use favorable generalities to rationalize questionable acts or beliefs. Vague and pleasant phrases are often used to justify such actions or beliefs.

Simplification. Favorable generalities are used to provide simple answers to complex social, political, economic, or military problems.

Transfer. This is a technique of projecting positive or negative qualities (praise or blame) of a person, entity, object, or value (an individual, group, organization, nation, patriotism, etc.) to another in order to make the second more acceptable or to discredit it. This technique is generally used to transfer blame from one member of a conflict to another. It evokes an emotional response which stimulates the target to identify with recognized authorities.

Least of Evils. This is a technique of acknowledging that the course of action being taken is perhaps undesirable but that any alternative would result in an outcome far worse. This technique is generally used to explain the need for sacrifices or to justify the seemingly harsh actions that displease the target audience or restrict personal liberties. Projecting blame on the enemy for the unpleasant or restrictive conditions is usually coupled with this technique.

Name Calling or Substitutions of Names or Moral Labels. This technique attempts to arouse prejudices in an audience by labeling the object of the propaganda campaign as something the target audience fears, hates, loathes, or finds undesirable.

Types of name calling:

Direct name calling is used when the audience is sympathetic or neutral. It is a simple, straightforward attack on an opponent or opposing idea.

Indirect name calling is used when direct name calling would antagonize the audience. It is a label for the degree of attack between direct name calling and insinuation. Sarcasm and ridicule are employed with this technique.

Cartoons, illustrations, and photographs are used in name calling, often with deadly effect.

Dangers inherent in name calling: In its extreme form, name calling may indicate that the propagandist has lost his sense of proportion or is unable to conduct a positive campaign. Before using this technique, the propagandist must weigh the benefits against the possible harmful results. It is best to avoid use of this device. The obstacles are formidable, based primarily on the human tendency to close ranks against a stranger. For example, a group may despise, dislike, or even hate one of its leaders, even openly criticize him, but may (and probably will) resent any nongroup member who criticizes and makes disparaging remarks against that leader.

Pinpointing the Enemy: This is a form of simplification in which a complex situation is reduced to the point where the "enemy" is unequivocally identified. For example, the president of country X is forced to declare a state of emergency in order to protect the peaceful people of his country from the brutal, unprovoked aggression by the leaders of country Y.

Plain Folks or Common Man: The "plain folks" or "common man" approach attempts to convince the audience that the propagandist's positions reflect the common sense of the people. It is designed to win the confidence of the audience by communicating in the common manner and style of the audience. Propagandists use ordinary language and mannerisms (and clothes in face-to-face and audiovisual communications) in attempting to identify their point of view with that of the average person. With the plain folks device, the propagandist can win the confidence of persons who resent or distrust foreign sounding, intellectual speech, words, or mannerisms.

 The audience can be persuaded to identify its interests with those of the propagandist:

Presenting soldiers as plain folks. The propagandist wants to make the enemy feel he is fighting against soldiers who are "decent, everyday folks" much like himself; this helps to counter themes that paint the opponent as a "bloodthirsty" killer.

Presenting civilians as plain folks. The "plain folks" or "common man" device also can help to convince the enemy that the opposing nation is not composed of arrogant, immoral, deceitful, aggressive, warmongering people, but of people like himself, wishing to live at peace.

Humanizing leaders. This technique paints a more human portrait of US and friendly military and civilian leaders. It humanizes them so that the audience looks upon them as similar human beings or, preferably, as kind, wise, fatherly figures.

Categories of Plain Folk Devices:

Vernacular. This is the contemporary language of a specific region or people as it is commonly spoken or written and includes songs, idioms, and jokes. The current vernacular of the specific target audience must be used.

Dialect. Dialect is a variation in pronunciation, grammar, and vocabulary from the norm of a region or nation. When used by the propagandist, perfection is required. This technique is best left to those to whom the dialect is native, because native level speakers are generally the best users of dialects in propaganda appeals.

Errors. Scholastic pronunciation, enunciation, and delivery give the impression of being artificial. To give the impression of spontaneity, deliberately hesitate between phrases, stammer, or mispronounce words. When not overdone, the effect is one of deep sincerity. Errors in written material may be made only when they are commonly made by members of the reading audience. Generally, errors should be restricted to colloquialisms.

Homey words. Homey words are forms of "virtue words" used in the everyday life of the average man. These words are familiar ones, such as "home," "family," "children," "farm," "neighbors," or cultural equivalents. They evoke a favorable emotional response and help transfer the sympathies of the audience to the propagandist. Homey words are widely used to evoke nostalgia. Care must be taken to assure that homey messages addressed to enemy troops do not also have the same effect on US/friendly forces.

If the propaganda or the propagandist lacks naturalness, there may be an adverse backlash. The audience may resent what it considers attempts to mock it, its language, and its ways.

Social Disapproval. This is a technique by which the propagandist marshals group acceptance and suggests that attitudes or actions contrary to the one outlined will result in social rejection, disapproval, or outright ostracism. The latter, ostracism, is a control practice widely used within peer groups and traditional societies.

Virtue Words. These are words in the value system of the target audience which tend to produce a positive image when attached to a person or issue. Peace, happiness, security, wise leadership, freedom, etc., are virtue words.

Slogans. A slogan is a brief striking phrase that may include labeling and stereotyping. If ideas can be sloganized, they should be, as good slogans are self-perpetuating.

Testimonials. Testimonials are quotations, in or out of context, especially cited to support or reject a given policy, action, program, or personality. The reputation or the role (expert, respected public figure, etc.) of the individual giving the statement is exploited. The testimonial places the official sanction of a respected person or authority on a propaganda message. This is done in an effort to cause the target audience to identify itself with the authority or to accept the authority's opinions and beliefs as its own. Several types of testimonials are:

Official Sanction. The testimonial authority must have given the endorsement or be clearly on record as having approved the attributed idea, concept, action, or belief.

Four factors are involved:

Accomplishment. People have confidence in an authority who has demonstrated outstanding ability and proficiency in his field. This accomplishment should be related to the subject of the testimonial.

Identification with the target. People have greater confidence in an authority with whom they have a common bond. For example, the soldier more readily trusts an officer with whom he has undergone similar arduous experiences than a civilian authority on military subjects.

Position of authority. The official position of authority may instill confidence in the testimony; i.e., head of state, division commander, etc.

Inanimate objects. Inanimate objects may be used in the testimonial device. In such cases, the propagandist seeks to transfer physical attributes of an inanimate object to the message. The Rock of Gibraltar, for example, is a type of inanimate object associated with steadfast strength.

Personal Sources of Testimonial Authority:
Enemy leaders. The enemy target audience will generally place great value on its high level military leaders as a source of information.

Fellow soldiers. Because of their common experiences, soldiers form a bond of comradeship. As a result, those in the armed forces are inclined to pay close attention to what other soldiers have to say.

Opposing leaders. Testimonials of leaders of the opposing nation are of particular value in messages that outline war aims and objectives for administering the enemy nation after it capitulates.

Famous scholars, writers, and other personalities. Frequently, statements of civilians known to the target as authoritative or famous scholars, writers, scientists, commentators, etc., can be effectively used in propaganda messages.

Nonpersonal Sources of Testimonial Authority:

Institutions, ideologies, national flags, religious, and other nonpersonal sources are often used. The creeds, beliefs, principles, or dogmas of respected authorities or other public figures may make effective propaganda testimonials.

Factors To Be Considered:

Plausibility. The testimonial must be plausible to the target audience. The esteem in which an authority is held by the target audience will not always transfer an implausible testimonial into effective propaganda.

False testimonials. Never use false testimonials. Highly selective testimonials? Yes. Lies (fabrications)? Never. Fabricated (false) testimonials are extremely vulnerable because their lack of authenticity makes them easy to challenge and discredit.

Propaganda Techniques which are Based on Characteristics of the Content but which Require Additional Information on the Part of an Analyst to be Recognized ~
Incredible truths. There are times when the unbelievable (incredible) truth not only can but should be used.

Among these occasions are:

When the psychological operator is certain that a vitally important event will take place.

A catastrophic event, or one of significant tactical or strategic importance, unfavorable to the enemy has occurred and the news has been hidden from the enemy public or troops.

The enemy government has denied or glossed over an event detrimental to its cause.

A double-cutting edge. This technique has a double-cutting edge: It increases the credibility of the US/friendly psychological operator while decreasing the credibility of the enemy to the enemy's target audience. Advanced security clearance must be obtained before using this technique so that operations or projects will not be jeopardized or compromised. Actually, propagandists using this technique will normally require access to special compartmented information and facilities to avoid compromise of other sensitive operations or projects of agencies of the US Government. Though such news will be incredible to the enemy public, it should be given full play by the psychological operator. This event and its significance will eventually become known to the enemy public in spite of government efforts to hide it. The public will recall (the psychological operator will "help" the recall process) that the incredible news was received from US/allied sources. They will also recall the deception of their government. The prime requirement in using this technique is that the disseminated incredible truth must be or be certain to become a reality.

Insinuation. Insinuation is used to create or stir up the suspicions of the target audience against ideas, groups, or individuals in order to divide an enemy. The propagandist hints, suggests, and implies, allowing the audience to draw its own conclusions. Latent suspicions and cleavages within the enemy camp are exploited in an attempt to structure them into active expressions of disunity which weaken the enemy's war effort.

Exploitable vulnerabilities. Potential cleavages which may be exploited include the following:

Political differences between the enemy nation and its allies or satellites.

Ethnic and regional differences.

Religious, political, economic, or social differences.

History of civilian animosity or unfair treatment toward enemy soldiers.

Comforts available to rear area soldiers and not available to combat soldiers.

People versus the bureaucracy or hierarchy.

Political differences within the ruling elite, between coalition members, or between rulers and those out of power.

Differences showing a few benefiting at the expense of the general populace.

Unequal or inequitable tax burdens, or the high level of taxes. The audience should be informed of hidden taxes.

The scarcity of consumer goods for the general public and their availability to the various elites and the dishonest.

Costs of present government policies in terms of lost opportunities to accomplish constructive socially desirable goals.

The powerlessness of the individual. (This may be used to split the audience from the policies of its government by disassociating its members from those policies.) This technique could be used in preparing a campaign to gain opposition to those government policies.

Insinuation devices. A number of devices are available to exploit these and similar vulnerabilities:
Leading questions: The propagandist may ask questions which suggest only one possible answer. Thus, the question, "What is there to do now that your unit is surrounded and you are completely cut off?" insinuates that surrender or desertion is the only reasonable alternative to annihilation.

Humor: Humor can be an effective form of insinuation. Jokes and cartoons about the enemy find a ready audience among those persons in the target country or military camp who normally reject straightforward accusations or assertions. Jokes about totalitarian leaders and their subordinates often spread with ease and rapidity. However, the psychological operator must realize that appreciation of humor differs among target groups and so keep humor within the appropriate cultural context.

Pure motives. This technique makes it clear that the side represented by the propagandist is acting in the best interests of the target audience, insinuating that the enemy is acting to the contrary. For example, the propagandist can use the theme that a satellite force fighting on the side of the enemy is insuring the continued subjugation of its country by helping the common enemy.

Guilt by association: Guilt by association links a person, group, or idea to other persons, groups, or ideas repugnant to the target audience. The insinuation is that the connection is not mutual, accidental, or superficial.

Rumor: Malicious rumors are also a potentially effective form of insinuation.

Pictorial and photographic propaganda: A photograph, picture, or cartoon can often insinuate a derogatory charge more effectively than words. The combination of words and photograph, picture, or cartoon can be far more effective. In this context, selected and composite photographs can be extremely effective.

Vocal: Radio propagandists can artfully suggest a derogatory notion, not only with the words they use, but also by the way in which they deliver them. Significant pauses, tonal inflections, sarcastic pronunciation, and ridiculing enunciation can be more subtle than written insinuation.

Card stacking or selective omission. This is the process of choosing from a variety of facts only those which support the propagandist's purpose.

In using this technique, facts are selected and presented which most effectively strengthen and authenticate the point of view of the propagandist. It includes the collection of all available material pertaining to a subject and the selection of that material which most effectively supports the propaganda line. Card stacking, case making, and censorship are all forms of selection. Success or failure depends on how successful the propagandist is in selecting facts or "cards" and presenting or "stacking" them.

Increase prestige. In time of armed conflict, leading personalities, economic and social systems, and other institutions making up a nation are constantly subjected to propaganda attacks. Card stacking is used to counter these attacks by publicizing and reiterating the best qualities of the institutions, concepts, or persons being attacked. Like most propaganda techniques, card stacking is used to supplement other methods.

The technique may also be used to describe a subject as virtuous or evil and to give simple answers to a complicated subject.

An intelligent propagandist makes his case by imaginative selection of facts. The work of the card stacker in using selected facts is divided into two main phases:

First, the propagandist selects only favorable facts and presents them to the target in such a manner as to obtain a desired reaction.

Second, the propagandist uses these facts as a basis for conclusions, trying to lead the audience into accepting the conclusions by accepting the facts presented.

Presenting the other side. Some persons in a target audience believe that neither belligerent is entirely virtuous. To them, propaganda solely in terms of right and wrong may not be credible. Agreement with minor aspects of the enemy's point of view may overcome this cynicism. Another use of presenting the other side is to reduce the impact of propaganda from opposing propagandists, who are likely to be card stacking (practicing selective omission) themselves.

Lying and distortion. Lying is stating as truth that which is contrary to fact. For example, assertions may be lies. This technique will not be used by US personnel. It is presented for use of the analyst of enemy propaganda.

Simplification. This is a technique in which the many facts of a situation are reduced so the right or wrong, good or evil, of an act or decision is obvious to all. This technique (simplification) provides simple solutions for complex problems. By suggesting apparently simple solutions for complex problems, this technique offers simplified interpretations of events, ideas, concepts, or personalities. Statements are positive and firm; qualifying words are never used.

Simplification may be used to sway uneducated and educated audiences. This is true because many persons are well educated or highly skilled, trained specialists in a specific field, but the limitations of time and energy often force them to turn to and accept simplifications to understand, relate, and react to other areas of interest.

Simplification has the following characteristics:

It thinks for others: Some people accept information which they cannot verify personally as long as the source is acceptable to them or the authority is considered expert. Others absorb whatever they read, see, or hear with little or no discrimination. Some people are too lazy or unconcerned to think problems through. Others are uneducated and willingly accept convenient simplifications.

It is concise: Simplification gives the impression of going to the heart of the matter in a few words. The average member of the target audience will not even consider that there may be another answer to the problem.

It builds ego: Some people are reluctant to believe that any field of endeavor, except their own, is difficult to understand. For example, a layman is pleased to hear that "law is just common sense dressed up in fancy language," or "modern art is really a hodgepodge of aimless experiment or nonsense." Such statements reinforce the ego of the lay audience. It is what they would like to believe, because they are afraid that law and modern art may actually be beyond their understanding. Simple explanations are given for complex subjects and problems.
Stereotyping is a form of simplification used to fit persons, groups, nations, or events into readymade categories that tend to produce a desired image of good or bad. Stereotyping puts the subject (people, nations, etc.) or event into a simplistic pattern without any distinguishing individual characteristics.

Characteristics of Content which may become Evident when Numerous Pieces of Output are Examined ~
Change of Pace. Change of pace is a technique of switching from belligerent to peaceful output, from "hot" to "cold," from persuasion to threat, from gloomy prophecy to optimism, from emotion to fact.

Stalling. Stalling is a technique of deliberately withholding information until its timeliness is past, thereby reducing the possibility of undesired impact.

Shift of Scene. With this technique, the propagandist replaces one "field of battle" with another. It is an attempt to take the spotlight off an unfavorable situation or condition by shifting it to another, preferably of the opponent, so as to force the enemy to go on the defense.

Repetition ~

An idea or position is repeated in an attempt to elicit an almost automatic response from the audience or to reinforce an audience's opinion or attitude. This technique is extremely valid and useful because the human being is basically a creature of habit and develops skills and values by repetition (like walking, talking, code of ethics, etc.). An idea or position may be repeated many times in one message or in many messages. The intent is the same in both instances, namely, to elicit an immediate response or to reinforce an opinion or attitude.

The audience is not familiar with the details of the threat posed. Ignorance of the details can be used to pose a threat and build fear.

Members of the audience are self-centered.

The target can take immediate action to execute simple, specific instructions.

Fear of change. People fear change, particularly sudden, imposed change over which they have no control. They fear it will take from them status, wealth, family, friends, comfort, safety, life, or limb. That's why the man in the foxhole hesitates to leave it. He knows and is accustomed to the safety it affords. He is afraid that moving out of his foxhole will expose him to new and greater danger. That is why the psychological campaign must give him a safe, honorable way out of his predicament or situation.

Terrorism. The United States is absolutely opposed to the use of terror or terror tactics. But the psychological operator can give a boomerang effect to enemy terror, making it reverberate against the practitioner, making him repugnant to his own people and all others who see the results of his heinous savagery. This can be done by disseminating fully captioned photographs in the populated areas of the terrorist's homeland. Such leaflets will separate civilians from their armed forces; they will give them second thoughts about the decency and honorableness of their cause, make them wonder about the righteousness of their ideology, and make the terrorists repugnant to them. Followup leaflets can "fire the flames" of repugnancy, indignation, and doubt, as most civilizations find terror repugnant.

In third countries. Fully captioned photographs depicting terroristic acts may be widely distributed in third countries (including the nation sponsoring the enemy) where they will instill a deep revulsion in the general populace. Distribution in neutral countries is particularly desirable in order to swing the weight of unbiased humanitarian opinion against the enemy.

The enemy may try to rationalize and excuse its conduct (terroristic), but in so doing, it will compound the adverse effect of its actions, because it can never deny the validity of true photographic representations of its acts. Thus, world opinion will sway to the side of the victimized people.

Friendly territory. Under no circumstances should such leaflets be distributed in friendly territory. To distribute them in the friendly area in which the terrorists' acts took place would only create feelings of insecurity. This would defeat the purpose of the psychological operator, which is to build confidence in the government or agency he represents.




Twenty-Five Ways To Suppress Truth: The Rules of Disinformation
by H. Michael Sweeney
<HMS@proparanoid.com>
(c) 1997, 2000, 2001 All rights reserved

Permission to reprint/distribute hereby granted for any non-commercial use provided the information is reproduced in its entirety and with author information intact. For more Intel/Shadow government related info, visit the Author's Web site: http://www.proparanoid.com

Built upon Thirteen Techniques for Truth Suppression by David Martin, the following may be useful to the initiate in the world of dealing with veiled and half-truth, lies, and suppression of truth when serious crimes are studied in public forums. This, sadly, includes everyday news media, one of the worst offenders with respect to being a source of disinformation. Where the crime involves a conspiracy, or a conspiracy to cover up the crime, there will invariably be a disinformation campaign launched against those seeking to uncover and expose the truth and/or the conspiracy. There are specific tactics which disinfo artists tend to apply, as revealed here. Also included with this material are eight common traits of the disinfo artist which may also prove useful in identifying players and motives. The more a particular party fits the traits and is guilty of following the rules, the more likely they are a professional disinfo artist with a vested motive. People can be bought, threatened, or blackmailed into providing disinformation, so even "good guys" can be suspect in many cases.

A rational person participating as one interested in the truth will evaluate that chain of evidence and conclude either that the links are solid and conclusive, that one or more links are weak and need further development before a conclusion can be arrived at, or that one or more links can be broken, usually invalidating (but not necessarily so, if parallel links already exist or can be found, or if a particular link was merely supportive, but not in itself key) the argument. The game is played by raising issues which either strengthen or weaken (preferably to the point of breaking) these links. It is the job of a disinfo artist to interfere with these evaluations... to at least make people think the links are weak or broken when, in truth, they are not... or to propose alternative solutions leading away from the truth. Often, by simply impeding and slowing down the process through disinformation tactics, a level of victory is assured because apathy increases with time and rhetoric.

It would seem true in almost every instance that if one cannot break the chain of evidence for a given solution, revelation of truth has won out. If the chain is broken, either a new link must be forged, or a whole new chain developed, or the solution is invalid and a new one must be found... but truth still wins out. There is no shame in being the creator or supporter of a failed solution, chain, or link, if done with honesty in search of the truth. This is the rational approach. While it is understandable that a person can become emotionally involved with a particular side of a given issue, it is really unimportant who wins, as long as truth wins. But the disinfo artist will seek to emotionalize and chastise any failure (real or false claims thereof), and will seek by means of intimidation to prevent discussion in general.

Twenty-Five Rules of Disinformation ~

1. Hear no evil, see no evil, speak no evil
2. Become incredulous and indignant
3. Create rumor mongers
4. Use a straw man
5. Sidetrack opponents w name calling, ridicule
6. Hit and Run
7. Question motives
8. Invoke authority
9. Play Dumb
10. Associate opponent charges with old news
11. Establish and rely upon fall-back positions
12. Enigmas have no solution
13. Alice in Wonderland Logic
14. Demand complete solutions
15. Fit the facts to alternate conclusions
16. Vanish evidence and witnesses
17. Change the subject
18. Emotionalize, Antagonize, and Goad
19. Ignore facts, demand impossible proofs
20. False evidence
21. Call a Grand Jury, Special Prosecutor
22. Manufacture a new truth
23. Create bigger distractions
24. Silence critics
25. Vanish

Eight Traits of The Disinformationalist ~

1. Avoidance
2. Selectivity
3. Coincidental
4. Teamwork
5. Anti-conspiratorial
6. Artificial Emotions
7. Inconsistent
8. Newly Discovered: Time Constant

The disinfo artist, and those who may pull their strings (those who stand to suffer should the crime be solved), MUST seek to prevent rational and complete examination of any chain of evidence which would hang them. Since fact and truth seldom fall on their own, they must be overcome with lies and deceit. Those who are professional in the art of lies and deceit, such as the intelligence community and the professional criminal (often the same people or at least working together), tend to apply fairly well defined and observable tools in this process. However, the public at large is not well armed against such weapons, and is often easily led astray by these time-proven tactics. Remarkably, not even the media and law enforcement have been trained to deal with these issues. For the most part, only the players themselves understand the rules of the game.

This is why concepts from the film Wag the Dog actually work. If you saw that movie, know that there is at least one real-world counterpart to Robert De Niro's spin-doctor character. For CIA, it is Mark Richards, who was called in to orchestrate the media response to Waco on behalf of Janet Reno. Mark Richards is the acknowledged High Priest of Disinformation. His appointment was extremely appropriate, since the CIA was VERY present at Waco from the very beginning of the cult to the very end of their days --- just as it was at the People's Temple in Jonestown. Richards' purpose in life is damage control.

For such disinformationalists, the overall aim is to avoid discussing links in the chain of evidence which cannot be broken by truth, but at all times, to use clever deceptions or lies to make select links seem weaker than they are, create the illusion of a break, or better still, cause any who are considering the chain to be distracted in any number of ways, including the method of questioning the credentials of the presenter. Please understand that fact is fact, regardless of the source. Likewise, truth is truth, regardless of the source. This is why criminals are allowed to testify against other criminals. Where a motive to lie may truly exist, only actual evidence that the testimony itself IS a lie renders it completely invalid. Were a known 'liar's' testimony to stand on its own without supporting fact, it might certainly be of questionable value, but if the testimony (argument) is based on verifiable or otherwise demonstrable facts, it matters not who does the presenting or what their motives are, or if they have lied in the past or even if motivated to lie in this instance -- the facts or links would and should stand or fall on their own merit and their part in the matter will merely be supportive.

Moreover, particularly with respect to public forums such as newspaper letters to the editor, and Internet chat and news groups, the disinfo type has a very important role. In these forums, the principal topics of discussion are generally attempts by individuals to cause other persons to become interested in their own particular position, idea, or solution -- very much in development at the time. People often use such mediums as a sounding board and in hopes of pollination to better form their ideas. Where such ideas are critical of government or powerful, vested groups (especially if their criminality is the topic), the disinfo artist has yet another role -- the role of nipping it in the bud. They also seek to stage the concept, the presenter, and any supporters as less than credible should any possible future confrontation in more public forums result due to their early successes. You can often spot the disinfo types at work here by the unique application of "higher standards" of discussion than necessarily warranted. They will demand that those presenting arguments or concepts back everything up with the same level of expertise as a professor, researcher, or investigative writer. Anything less renders any discussion meaningless and unworthy in their opinion, and anyone who disagrees is obviously stupid -- and they generally put it in exactly those terms.

So, as you read any such discussions, particularly so in Internet news groups (NG), decide for yourself when a rational argument is being applied and when disinformation, psyops (psychological warfare operations) or trickery is the tool. Accuse those guilty of the latter freely. They (both those deliberately seeking to lead you astray, and those who are simply foolish or misguided thinkers) generally run for cover when thus illuminated, or -- put in other terms, they put up or shut up (a perfectly acceptable outcome either way, since truth is the goal.) Here are the twenty-five methods and eight traits, some of which don't apply directly to NG application. Each contains a simple example in the form of actual quotes (some paraphrased for simplicity) from NG comments on commonly known historical events, and a proper response. Accusations should not be overused -- reserve them for repeat offenders and those who use multiple tactics. Responses should avoid falling into emotional traps or informational sidetracks, unless it is feared that some observers will be easily dissuaded by the trickery. Consider quoting the complete rule rather than simply citing it, as others will not have the reference. Offer to provide a complete copy of the rule set upon request (see permissions statement at end):

Twenty-Five Rules of Disinformation ~

Note: The first rule and last five (or six, depending on situation) rules are generally not directly within the ability of the traditional disinfo artist to apply. These rules are generally used more directly by those at the leadership, key players, or planning level of the criminal conspiracy or conspiracy to cover up.

(1) Hear No Evil, See No Evil, Speak No Evil ~ Regardless of what you know, don't discuss it -- especially if you are a public figure, news anchor, etc. If it's not reported, it didn't happen, and you never have to deal with the issues.

Example: Media was present in the courtroom (Hunt vs. Liberty Lobby) when CIA agent Marita Lorenz's 'confession' testimony regarding direct CIA participation in the planning and assassination of John Kennedy was revealed. All the media reported was that E. Howard Hunt lost his libel case against Liberty Lobby (Liberty Lobby's newspaper, The Spotlight, had reported that Hunt was in Dallas that day and was sued for the story). See Mark Lane's remarkable book, Plausible Denial, for the full confessional transcript.

Proper response: There is no possible response unless you are aware of the material and can make it public yourself. In any such attempt, be certain to target any known silent party as likely complicit in a cover-up. In this case, it would be the entire Time-Warner Media Group, among others. This author is relatively certain that reporters were hand-picked to cover this case from among those having intelligence community ties.

(2) Become Incredulous and Indignant ~ Avoid discussing key issues and instead focus on side issues which can be used to show the topic as being critical of some otherwise sacrosanct group or theme. This is also known as the 'How dare you!' gambit.

Example: 'How dare you suggest that the Branch Davidians were murdered! The FBI and BATF are made up of America's finest and best-trained law enforcement, operate under the strictest of legal requirements, and are under the finest leadership the President could want to appoint.'

Proper response: You are avoiding the Waco issue with disinformation tactics. Your high opinion of FBI is not founded in fact. All you need do is examine Ruby Ridge and any number of other examples, and you will see a pattern of abuse of power that demands attention to charges against FBI/BATF at Waco. Why do you refuse to address the issues with disinformation tactics (rule 2 - become incredulous and indignant)?

(3) Create Rumor Mongers ~ Avoid discussing issues by describing all charges, regardless of venue or evidence, as mere rumors and wild accusations. Other derogatory terms mutually exclusive of truth may work as well. This method works especially well with a silent press, because the only way the public can learn of the facts is through such 'arguable rumors'. If you can associate the material with the Internet, use this fact to certify it a 'wild rumor' from a 'bunch of kids on the Internet' which can have no basis in fact.

Example: 'You can't prove his material was legitimately from French Intelligence. Pierre Salinger had a chance to show his 'proof' that flight 800 was brought down by friendly fire, and he didn't. All he really had was the same old baseless rumor that's been floating around the Internet for months.'

Proper response: You are avoiding the issue with disinformation tactics. The Internet charge reported widely is based on a single FBI interview statement to media and a similar statement by a Congressman, neither of whom had actually seen Pierre's document. As the FBI is being accused of participating in a cover-up of this matter, and Pierre claims his material is not Internet sourced, it is natural that the FBI would have reason to paint his material in a negative light. For you to assume the FBI to have no bias in the face of Salinger's credentials and unchanged stance suggests you are biased. At best you can say the matter is in question. Further, to imply that material found on the Internet is worthless is unfounded. At best you may say it must be considered carefully before accepting it, which will require addressing the actual issues. Why do you refuse to address these issues with disinformation tactics (rule 3 - create rumor mongers)?

(4) Use a Straw Man ~ Find or create a seeming element of your opponent's argument which you can easily knock down to make yourself look good and the opponent to look bad. Either make up an issue you may safely imply exists based on your interpretation of the opponent/opponent arguments/situation, or select the weakest aspect of the weakest charges. Amplify their significance and destroy them in a way which appears to debunk all the charges, real and fabricated alike, while actually avoiding discussion of the real issues.

Example: When trying to defeat reports by the Times of London that spy-sat images reveal an object racing towards and striking flight 800, a straw man is used. The disinformationalist, later identified as having worked for Naval Intelligence, simply stated: 'If these images exist, the public has not seen them. Why? They don't exist, and never did. You have no evidence and thus, your entire case falls flat.'

Proper response: 'You are avoiding the issue with disinformation tactics. You imply deceit and deliberately establish an impossible and unwarranted test. It is perfectly natural that the public has not seen them, nor will they for some considerable time, if ever. To produce them would violate national security with respect to intelligence gathering capabilities and limitations, and you should know this. Why do you refuse to address the issues with such disinformation tactics (rule 4 - use a straw man)?'

(5) Sidetrack Opponents with Name-Calling and Ridicule ~ This is also known as the primary 'attack the messenger' ploy, though other methods qualify as variants of that approach. Associate opponents with unpopular titles such as 'kooks', 'right-wing', 'liberal', 'left-wing', 'terrorists', 'conspiracy buffs', 'radicals', 'militia', 'racists', 'religious fanatics', 'sexual deviates', and so forth. This makes others shrink from support out of fear of gaining the same label, and you avoid dealing with issues.

Example: 'You believe what you read in the Spotlight? The publisher, Willis Carto, is a well-known right-wing racist. I guess we know your politics -- does your Bible have a swastika on it? That certainly explains why you support this wild-eyed, right-wing conspiracy theory.'

Proper response: 'You are avoiding the issue with disinformation tactics. You imply guilt by association and attack truth on the basis of the messenger. The Spotlight is a well-known Populist media source responsible for releasing facts and stories well before the mainstream media will discuss the issues through their veil of silence. Willis Carto has successfully handled lawsuits regarding slanderous statements such as yours. Your undemonstrated charges against the messenger have nothing to do with the facts or the issues, and fly in the face of reason. Why do you refuse to address the issues by use of such disinformation tactics (rule 5 - sidetrack opponents with name calling and ridicule)?'

(6) Hit and Run ~ In any public forum, make a brief attack on your opponent or the opponent's position and then scamper off before an answer can be fielded, or simply ignore any answer. This works extremely well in Internet and letters-to-the-editor environments where a steady stream of new identities can be called upon without ever having to explain the reasoning behind the criticism -- simply make an accusation or other attack, never discussing issues, and never answering any subsequent response, for that would dignify the opponent's viewpoint.

Example: 'This stuff is garbage. Where do you conspiracy lunatics come up with this crap? I hope you all get run over by black helicopters.' Notice it even has a farewell sound to it, so it won't seem curious if the author is never heard from again.

Proper response: 'You are avoiding the issue with disinformation tactics. Your comments or opinions fail to offer any meaningful dialog or information, and are worthless except to pander to emotionalism, and in fact, reveal you to be emotionally insecure with these matters. If you do not like reading 'this crap', why do you frequent this NG which is clearly for the purpose of such discussion? Why do you refuse to address the issues by use of such disinformation tactics (rule 6 - hit and run)?'

(7) Question Motives ~ Twist or amplify any fact which could be taken to imply that the opponent operates out of a hidden personal agenda or other bias. This avoids discussing issues and forces the accuser on the defensive.

Example: 'With the talk-show circuit and the book deal, it looks like you can make a pretty good living spreading lies.'

Proper response: 'You are avoiding the issue with disinformation tactics. You imply guilt as a means of attacking the messenger or his credentials, but fail to offer any concrete evidence that this is so. If you think what has been presented are 'lies', why not simply demonstrate it? Why do you refuse to address the issues by use of such disinformation tactics (rule 7 - question motives)?'

(8) Invoke Authority ~ Claim authority for yourself or associate yourself with authority, and present your argument with enough 'jargon' and 'minutiae' to illustrate that you are 'one who knows'; then simply say it isn't so, without discussing the issues, demonstrating concretely why, or citing sources.

Example: 'You obviously know nothing about either the politics or strategic considerations, much less the technicals of the SR-71. Incidentally, for those who might care, that sleek plane is started with a pair of souped-up big-block V-8's (originally, Buick 454 C.I.D. with dual 450 CFM Holley carbs and full-race Isky cams -- for 850 combined BHP @ 6,500 RPM) using a dragster-style clutch with direct-drive shaft. Anyway, I can tell you with confidence that no Blackbird has ever been flown by Korean nationals, nor have they ever been trained to fly it, and they have certainly never overflown the Republic of China in an SR or even launched a drone from it that flew over China. I'm not authorized to discuss if there have been overflights by American pilots.'

Proper response: 'You are avoiding the issue with disinformation tactics. You imply your own authority and expertise but fail to provide credentials, and you also fail to address the issues and cite sources. You simply cite 'Jane's-like' information to make us think you know what you are talking about. Why do you refuse to address the issues by use of such disinformation tactics (rule 8 - invoke authority)?'

(9) Play Dumb ~ No matter what evidence or logical argument is offered, avoid discussing the issues except to deny that they have any credibility, make any sense, provide any proof, contain or make a point, follow logic, or support a conclusion. Mix well for maximum effect.

Example: 'Nothing you say makes any sense. Your logic is idiotic. Your facts nonexistent. Better go back to the drawing board and try again.'

Proper response: 'You are avoiding the issue with disinformation tactics. You evade the issues with your own form of nonsense while others, perhaps more intelligent than you pretend to be, have no trouble with the material. Why do you refuse to address the issues by use of such disinformation tactics (Rule 9 - play dumb)?'

(10) Associate Opponent Charges with Old News ~ A derivative of the straw man. Usually, in any large-scale matter of high visibility, someone will make charges early on which can be, or already were, easily dealt with -- a kind of investment for the future should the matter not be so easily contained. Where it can be foreseen, have your own side raise a straw man issue and have it dealt with early on as part of the initial contingency plans. Subsequent charges, regardless of validity or new ground uncovered, can usually then be associated with the original charge and dismissed as simply being a rehash, without need to address current issues -- so much the better where the opponent is or was involved with the original source.

Example: 'Flight 553's crash was pilot error, according to the NTSB findings. Digging up new witnesses who say the CIA brought it down at a selected spot and were waiting for it with 50 agents won't revive that old dead horse buried by NTSB more than twenty years ago.'

Proper response: 'You are avoiding the issue with disinformation tactics. You ignore the issues and imply they are old charges, as if new information were irrelevant to truth. Why do you refuse to address the issues by use of such disinformation tactics (rule 10 - associate charges with old news)?'

(11) Establish and Rely Upon Fall-Back Positions ~ Using a minor matter or element of the facts, take the 'high road' and 'confess' with candor that some innocent mistake, in hindsight, was made -- but that opponents have seized on the opportunity to blow it all out of proportion and imply a greater criminality which 'just isn't so.' Others can reinforce this on your behalf, later, and even publicly 'call for an end to the nonsense' because you have already 'done the right thing.' Done properly, this can garner sympathy and respect for 'coming clean' and 'owning up' to your mistakes without addressing more serious issues.

Example: 'Reno admitted in hindsight she should have taken more time to question the data provided by subordinates on the deadliness of CS-4 and the likely Davidian response to its use, but she was so concerned about the children that she elected, in what she now believes was a sad and terrible mistake, to order the tear gas be used.'

Proper response: 'You are avoiding the issue with disinformation tactics. You evade the true issue by focusing on a side issue in an attempt to evoke sympathy. Perhaps you did not know that CIA Public Relations expert Mark Richards was called in to help Janet Reno with the Waco aftermath response? How warm and fuzzy it makes us feel -- so much so that we are to ignore the more important matters being discussed. Why do you refuse to address the issues by use of such disinformation tactics (rule 11 - establish and rely upon fall-back positions)?'

(12) Enigmas Have No Solution ~ Drawing upon the overall umbrella of events surrounding the crime and the multitude of players and events, paint the entire affair as too complex to solve. This causes those otherwise following the matter to lose interest more quickly, without your having to address the actual issues.

Example: 'I don't see how you can claim Vince Foster was murdered since you can't prove a motive. Before you could do that, you would have to completely solve the whole controversy over everything that went on in the White House and in Arkansas, and even then, you would have to know a heck of a lot more about what went on within the NSA, the Travel Office, and the secret Grand Jury, and on, and on, and on. It's hopeless. Give it up.'

Proper response: 'You are avoiding the issue with disinformation tactics. You completely evade the issues and attempt to deter others from daring to examine them by making the mountain much bigger than it need be. You eat an elephant one bite at a time. Why do you refuse to address the issues by use of such disinformation tactics (rule 12 - enigmas have no solution)?'

(13) Alice in Wonderland Logic ~ Avoid discussion of the issues by reasoning backwards, or with an apparent deductive logic which ignores any actual material fact.

Example: 'The news media operates in a fiercely competitive market where stories are gold. This means they dig, dig, dig for the story -- often doing a better job than law enforcement. If there was any evidence that BATF had prior knowledge of the Oklahoma City bombing, they would surely have uncovered it and reported it. They haven't reported it, so there can't have been any prior knowledge. Put up or shut up.'

Proper response: 'You are avoiding the issue with disinformation tactics. Your backwards logic does not work here. Has the media reported that the CIA killed Kennedy when they knew it? No; despite their presence at the courtroom testimony 'confession' by CIA operative Marita Lorenz in the libel trial between E. Howard Hunt and Liberty Lobby, they only told us the trial verdict. THAT would have been the biggest story of the century, but they didn't print it, did they? Why do you refuse to address the issues by use of such disinformation tactics (rule 13 - Alice in Wonderland logic)?'

(14) Demand Complete Solutions ~ Avoid the issues by requiring opponents to solve the crime at hand completely, a ploy which works best with issues qualifying for rule 10.

Example: 'Since you know so much, if James Earl Ray is as innocent as you claim, who really killed Martin Luther King, how was it planned and executed, how did they frame Ray and fool the FBI, and why?'

Proper response: 'You are avoiding the issue with disinformation tactics. It is not necessary to completely resolve the full matter in order to examine any related issue attached to it. Discussion of any evidence of Ray's innocence can stand alone to serve truth, and any alternative solution to the crime, while it may bolster that truth, can also stand alone. Why do you refuse to address the issues by use of such disinformation tactics (rule 14 - demand complete solutions)?'

(15) Fit the Facts to Alternate Conclusions ~ This requires creative thinking unless the crime was planned with contingency conclusions in place.

Example: 'The cargo door failed on Flight 800 and caused a catastrophic breakup which ruptured the fuel tank and caused it to explode.'

Proper response: The best definitive example of avoiding issues by this technique is, perhaps, Arlen Specter's Magic Bullet from the Warren Report. This was eloquently defeated in court, but the media blindly accepted it without challenge. Thus rewarded, disinformationalists do not shrink from its application, even though today, thanks in part to the movie JFK, most Americans now understand it was fabricated nonsense. Thus the defense which works best may actually be to cite the Magic Bullet: 'You are avoiding the issue with disinformation tactics. Your imaginative twisting of facts rivals that of Arlen Specter's Magic Bullet in the Warren Report. We all know why the impossible magic bullet was invented. You invent a cargo door problem when there has been not one shred of evidence from the crash investigation to support it, and in fact, actual photos of the cargo door hinges and locks disprove you. Why do you refuse to address the issues by use of such disinformation tactics (rule 15 - fit facts to an alternate conclusion)?'

(16) Vanish Evidence and Witnesses ~ If it does not exist, it is not fact, and you won't have to address the issue.

Example: 'You can't say Paisley is still alive... that his death was faked and the list of CIA agents found on his boat deliberately placed there to support a purge at CIA. You have no proof. Why can't you accept the police reports?' This is a good ploy, since the dental records and autopsy report showing his body was two inches too long and the teeth weren't his were lost right after his wife demanded an inquiry, and since his body was cremated before she could view it -- all that remains are the police reports. Handy.

Proper response: There is no suitable response to actually vanished materials or persons, unless you can shed light on the matter, particularly if you can tie the event to a cover-up or other criminality. However, with respect to dialog where it is used against the discussion, you can respond: 'You are avoiding the issue with disinformation tactics. The best you can say is that the matter is in contention ONLY because of highly suspicious matters such as the simultaneous and mysterious vanishing of three sets of evidence. The suspicious nature itself tends to support the primary allegation. Why do you refuse to address the remaining issues by use of such disinformation tactics (rule 16 - vanish evidence and witnesses)?'

(17) Change the Subject ~ Usually in connection with one of the other ploys listed here, find a way to side-track the discussion with abrasive or controversial comments in hopes of turning attention to a new, more manageable topic. This works especially well with companions who can 'argue' with you over the new topic and polarize the discussion arena in order to avoid discussing more key issues.

Example: 'There were no CIA drugs and no drug-money laundering through Mena, Arkansas, and certainly there was no Bill Clinton knowledge of it, because it simply didn't happen. This is merely an attempt by his opponents to put Clinton off balance and at a disadvantage in the election: Dole is such a weak candidate with nothing to offer that they are desperate to come up with something to swing the polls. Dole simply has no real platform.' Assistant's response: 'You idiot! Dole has the clearest vision of what's wrong with Government since McGovern. Clinton is only interested in raping the economy, the environment, and every woman he can get his hands on...' One naturally feels compelled, regardless of party of choice, to jump in defensively on that one...

Proper response: 'You are both avoiding the issue with disinformation tactics. You evade discussion of the issues by attempting to sidetrack us with an emotional response to a new topic -- a trap which we will not fall into willingly. If you truly believe such political rhetoric, please drop out of this discussion, as it is not germane, and take it to one of the more appropriate politics NGs. Why do you refuse to address the issues by use of such disinformation tactics (rule 17 - change the subject)?'

(18) Emotionalize, Antagonize, and Goad Opponents ~ If you can't do anything else, chide and taunt your opponents and draw them into emotional responses which will tend to make them look foolish and overly motivated, and generally render their material somewhat less coherent. Not only will you avoid discussing the issues in the first instance, but even if their emotional response addresses the issue, you can further avoid the issues by then focusing on how 'sensitive they are to criticism.'

Example: 'You are such an idiot to think that possible -- or are you such a paranoid conspiracy buff that you think the 'gubment' is cooking your pea-brained skull with microwaves, which is the only justification you might have for dreaming up this drivel.' After drawing an emotional response: 'Ohhh... I do seem to have touched a sensitive nerve. Tsk, tsk. What's the matter? The truth too hot for you to handle? Perhaps you should stop relying on the Psychic Friends Network and see a psychiatrist for some real professional help...'

Proper response: 'You are avoiding the issue with disinformation tactics. You attempt to draw me into an emotional response without discussion of the issues. If you have something useful to contribute which defeats my argument, let's hear it -- preferably without snide and unwarranted personal attacks, if you can manage to avoid sinking so low. Your useless rhetoric serves no purpose here if that is all you can manage. Why do you refuse to address the issues by use of such disinformation tactics (rule 18 - emotionalize, antagonize, and goad opponents)?'

(19) Ignore Proof Presented, Demand Impossible Proofs ~ This is perhaps a variant of the 'play dumb' rule. Regardless of what material may be presented by an opponent in public forums, claim the material irrelevant and demand proof that is impossible for the opponent to come by (it may exist, but not be at his disposal, or it may be something which is known to be safely destroyed or withheld, such as a murder weapon). In order to completely avoid discussing the issues, it may be required that you categorically deny and be critical of media or books as valid sources, deny that witnesses are acceptable, or even deny that statements made by government or other authorities have any meaning or relevance.

Example: 'All he's done is to quote the liberal media and a bunch of witnesses who aren't qualified. Where's his proof? Show me wreckage from flight 800 that shows a missile hit it!'

Proper response: 'You are avoiding the issue with disinformation tactics. You presume we should not accept Don Phillips, reporter for the Washington Post, Al Baker, Craig Gordon or Liam Pleven, reporters for Newsday, Matthew Purdy, Matthew L. Wald, or Don Van Natta Jr., reporters for the New York Times, or Pat Milton, wire reporter for the Associated Press -- as being able to tell us anything useful about the facts in this matter. Neither would you allow us to accept Robert E. Francis, Vice Chairman of the NTSB, Joseph Cantamessa Jr., Special Agent in Charge of the New York Office of the F.B.I., Dr. Charles Wetli, Suffolk County Medical Examiner and the pathologist examining the bodies, nor unnamed Navy divers, crash investigators, or other cited officials, including Boeing Aircraft representatives who are part of the crash investigative team -- as qualified parties in this matter, and thus you dismiss this material out of hand. Good logic -- about as good as saying 150 eyewitnesses aren't qualified. Then you demand that we produce evidence which you know is not accessible to us, evidence held by the FBI, whom we accuse of a cover-up. Thus, only YOU are qualified to tell us what to believe? Witnesses be damned? Radar tracks be damned? Satellite tracks be damned? Reporters be damned? Photographs be damned? Government statements be damned? Is there a pattern here? Why do you refuse to address the issues by use of such disinformation tactics (rule 19 - ignore proof presented, demand impossible proofs)?'

(20) False Evidence ~ Whenever possible, introduce new facts or clues designed and manufactured to conflict with opponent presentations -- as useful tools to neutralize sensitive issues or impede resolution. This works best when the crime was designed with contingencies for the purpose, and the facts cannot be easily separated from the fabrications.

Example: Jack Ruby warned the Warren Commission that the White Russian separatists, the Solidarists, were involved in the assassination. This was a handy 'confession', since Jack and Earl were both on the same team in terms of the cover-up, and since it is now known that Jack worked directly with CIA in the assassination (see below).

Proper response: This one can be difficult to respond to unless you see it clearly, as in the following example, where more is known today than was known earlier... 'You are avoiding the issue with disinformation tactics. Your information is known to have been designed to sidetrack this issue. As revealed under oath in court by CIA operative Marita Lorenz in E. Howard Hunt vs. Liberty Lobby, CIA operatives E. Howard Hunt, James McCord, and others met with Jack Ruby in Dallas the night before the assassination of JFK to distribute guns and money. Clearly, Ruby was a co-conspirator whose 'Solidarist confession' was meant to sidetrack any serious investigation of the murder AWAY from CIA. Why do you refuse to address the issues by use of such disinformation tactics (rule 20 - false evidence)?'

(21) Call a Grand Jury, Special Prosecutor, or Other Empowered Investigative Body ~ Subvert the process to your benefit and effectively neutralize all sensitive issues without open discussion. Once convened, the evidence and testimony are required to be secret when properly handled. For instance, if you own the prosecuting attorney, it can ensure that a Grand Jury hears no useful evidence and that the evidence is sealed and unavailable to subsequent investigators. Once a favorable verdict is achieved, the matter can be considered officially closed. Usually, this technique is applied to find the guilty innocent, but it can also be used to obtain charges when seeking to frame a victim.

Example: According to one OK bombing Federal Grand Juror who violated the law to speak the truth, jurors were, contrary to law, denied the power to subpoena witnesses of their choosing, denied the power to ask witnesses questions of their choosing, and relegated to hearing only the evidence the prosecution wished them to hear -- evidence which clearly seemed fraudulent and intended to paint conclusions other than those the facts actually suggested.

Proper response: There is usually no adequate response to this tactic except to complain loudly at any sign of its application, particularly with respect to any possible cover-up. This happened locally in Oklahoma, and as a result, a new Grand Jury has been called to rehear evidence that government officials knew in advance that the bombing was going to take place, along with a number of new facts which indicate it was impossible for Timothy McVeigh to have done the deed without access to extremely advanced explosive devices available ONLY to the military or intelligence community, such as CIA's METC technology. The media has refused to cover the new Oklahoma Grand Jury process, by the way.

(22) Manufacture a New Truth ~ Create your own expert(s), group(s), author(s), or leader(s), or influence existing ones willing to forge new ground via scientific, investigative, or social research or testimony which concludes favorably. In this way, if you must actually address issues, you can do so authoritatively.

Example: The False Memory Syndrome Foundation, the American Family Foundation, and the American and Canadian Psychiatric Associations fall into this category, as their founding members and/or leadership include key persons associated with CIA Mind Control research. Read The Professional Paranoid, or Psychic Dictatorship in the U.S.A. by Alex Constantine, for more information. Not so curious, then, that (in a perhaps oversimplified explanation here) these organizations focus on demonstrating, by means of their own 'research findings', that there is no such thing as Mind Control.

Proper response: Unless you are in a position to be well versed in the topic and know of the background and relationships involved in the opponent organization, you are not well equipped to fight this tactic.

(23) Create Bigger Distractions ~ If the above does not seem to be working to distract from sensitive issues, or to prevent unwanted media coverage of unstoppable events such as trials, create bigger news stories (or treat them as such) to distract the multitudes.

Example: To distract the public from the progress of a WTC bombing trial that seems to be uncovering nasty ties to the intelligence community, have an endless discussion of skaters whacking other skaters on the knee. To distract the public from the progress of the Waco trials that have the potential to reveal government-sponsored murder, have an O.J. summer. To distract the public from an ever-disintegrating McVeigh trial situation and the danger of exposing government involvement, come up with something else (Flight 800?) to talk about -- or, keeping in the sports theme, how about sports fans shooting referees and players during a game, and then focusing on the whole gun-control thing?

Proper response: The best you can do is attempt to keep public debate and interest in the true issues alive and point out that the 'news flap' or other evasive tactic serves the interests of your opponents.

(24) Silence Critics ~ If the above methods do not prevail, consider removing opponents from circulation by some definitive solution so that the need to address issues is removed entirely. This can be by their death, arrest and detention, blackmail or destruction of their character through release of blackmail information, or merely by destroying them financially or emotionally, or by severely damaging their health.

Example: As experienced by certain proponents of friendly-fire theories with respect to flight 800 -- send in FBI agents to intimidate and threaten that if they persisted further they would be subject to charges of aiding and abetting Iranian terrorists, of failing to register as a foreign agent, or of other trumped-up charges. If this doesn't work, you can always plant drugs and bust them.

Proper response: You have three defensive alternatives if you think yourself potential victim of this ploy. One is to stand and fight regardless. Another is to create for yourself an insurance policy which will point to your opponents in the event of any unpleasantness, a matter which requires superior intelligence information on your opponents and great care in execution to avoid dangerous pitfalls (see The Professional Paranoid by this author for suggestions on how this might be done). The last alternative is to cave in or run (same thing.)

(25) Vanish ~ If you are a key holder of secrets or otherwise overly illuminated and you think the heat is getting too hot, to avoid the issues, vacate the kitchen.

Example: Do a Robert Vesco and retire to the Caribbean. If you don't, somebody in your organization may choose to vanish you the way of Vince Foster or Ron Brown.

Proper response: You will likely not have a means to attack this method, except to focus on the vanishing in hopes of uncovering it was by foul play or deceit as part of a deliberate cover up.



Eight Traits of the Disinformationalist
by H. Michael Sweeney <HMS@proparanoid.com>
Copyright (c) 1997, 2000. All rights reserved.

(1) Avoidance ~ They never actually discuss issues head-on or provide constructive input, generally avoiding citation of references or credentials. Rather, they merely imply this, that, and the other. Virtually everything about their presentation implies their authority and expert knowledge in the matter without any further justification for credibility.

(2) Selectivity ~ They tend to pick and choose opponents carefully, either applying the hit-and-run approach against mere commentators supportive of opponents, or focusing heavier attacks on key opponents who are known to directly address issues. Should a commentator become argumentative with any success, the focus will shift to include the commentator as well.

(3) Coincidental ~ They tend to surface suddenly and somewhat coincidentally with a new controversial topic with no clear prior record of participation in general discussions in the particular public arena involved. They likewise tend to vanish once the topic is no longer of general concern. They were likely directed or elected to be there for a reason, and vanish with the reason.

(4) Teamwork ~ They tend to operate in self-congratulatory and complementary packs or teams. Of course, this can happen naturally in any public forum, but there will likely be an ongoing pattern of frequent exchanges of this sort where professionals are involved. Sometimes one of the players will infiltrate the opponent camp to become a source for straw man or other tactics designed to dilute opponent presentation strength.

(5) Anti-Conspiratorial ~ They almost always have disdain for 'conspiracy theorists' and, usually, for those who in any way believe JFK was not killed by LHO. Ask yourself why, if they hold such disdain for conspiracy theorists, do they focus on defending a single topic discussed in a NG focusing on conspiracies? One might think they would either be trying to make fools of everyone on every topic, or simply ignore the group they hold in such disdain. Or, one might more rightly conclude they have an ulterior motive for their actions in going out of their way to focus as they do.

(6) Artificial Emotions ~ An odd kind of 'artificial' emotionalism and an unusually thick skin -- an ability to persevere and persist even in the face of overwhelming criticism and rejection. This likely stems from intelligence community training to deny everything, no matter how condemning the evidence, and never become emotionally involved or reactive. The net result for a disinfo artist is that emotions can seem artificial. Most people, if responding in anger, for instance, will express their animosity throughout their rebuttal. But disinfo types usually have trouble maintaining the 'image' and run hot and cold between their pretended emotions and their usually more calm or unemotional communications style. It's just a job, and they often seem unable to 'act their role in character' as well in a communications medium as they might in a real face-to-face conversation or confrontation. You might see outright rage and indignation one moment, ho-hum the next, and more anger later -- an emotional yo-yo. With respect to being thick-skinned, no amount of criticism will deter them from doing their job, and they will generally continue their old disinfo patterns without any adjustment in response to criticism of how obvious it is that they play that game -- whereas a more rational individual who truly cares what others think might seek to improve their communications style and substance, or simply give up.

(7) Inconsistent ~ There is also a tendency to make mistakes which betray their true self/motives. This may stem from not really knowing their topic, or it may be somewhat 'Freudian', so to speak, in that perhaps they really root for the side of truth deep within. I have noted that often they will simply cite contradictory information which neutralizes itself and the author. For instance, one such player claimed to be a Navy pilot, but blamed his poor communicating skills (spelling, grammar, incoherent style) on having only a grade-school education. I'm not aware of too many Navy pilots who don't have a college degree. Another claimed no knowledge of a particular topic/situation but later claimed first-hand knowledge of it.

(8) Time Constant ~ Recently discovered, with respect to News Groups, is the response time factor. There are three ways this can be seen to work, especially when the government or other empowered player is involved in a cover-up operation: (1) ANY NG posting by a targeted proponent of truth can result in an IMMEDIATE response. The government and other empowered players can afford to pay people to sit there and watch for an opportunity to do some damage. SINCE DISINFO IN A NG ONLY WORKS IF THE READER SEES IT - FAST RESPONSE IS CALLED FOR, or the visitor may be swayed towards truth. (2) When dealing in more direct ways with a disinformationalist, such as email, DELAY IS CALLED FOR - there will usually be a minimum of a 48-72 hour delay. This allows a sit-down team discussion on response strategy for best effect, and even enough time to 'get permission' or instruction from a formal chain of command. (3) In the NG example (1) above, it will often ALSO be seen that bigger guns are drawn and fired after the same 48-72 hour delay -- the team approach in play. This is especially true when the targeted truth seeker or their comments are considered more important with respect to potential to reveal truth. Thus, a serious truth sayer will be attacked twice for the same sin.

I close with the first paragraph of the introduction to my unpublished book, Fatal Rebirth:

Truth cannot live on a diet of secrets, withering within entangled lies. Freedom cannot live on a diet of lies, surrendering to the veil of oppression. The human spirit cannot live on a diet of oppression, becoming subservient in the end to the will of evil. God, as truth incarnate, will not long let stand a world devoted to such evil. Therefore, let us have the truth and freedom our spirits require... or let us die seeking these things, for without them, we shall surely and justly perish in an evil world.


References

(Source: http://www.propagandacritic.com)

(1) Chase, Stuart: Guides to Straight Thinking; New York: Harper and Brothers, 1956.
(2) Combs, James & Nimmo, Dan: The New Propaganda: The Dictatorship of Palaver in Contemporary Politics; New York: Longman Publishing Group, 1993.
(3) Doob, Leonard: Propaganda: Its Psychology and Technique; New York: Henry Holt and Company, 1935.
(4) Edwards, Violet: Group Leader's Guide to Propaganda Analysis; New York: Columbia University Press, 1938.
(5) Ellul, Jacques: Propaganda: The Formation of Men's Attitudes; New York: Vintage Books, 1965.
(6) Hummel, William & Huntress, Keith: The Analysis of Propaganda; New York: William Sloane Associates, 1949.
(7) Institute for Propaganda Analysis: Propaganda Analysis; New York: Columbia University Press, 1938.
(8) Institute for Propaganda Analysis: The Fine Art of Propaganda; New York: Harcourt, Brace and Company, 1939.
(9) Lee, Alfred McClung: How to Understand Propaganda; New York: Rinehart and Company, 1952.
(10) Lowenthal, Leo & Guterman, Norbert: Prophets of Deceit (1949); Palo Alto: Pacific Books Publishers, 1970.
(11) Miller, Clyde: The Process of Persuasion; New York: Crown Publishers, 1946.
(12) Pratkanis, Anthony & Aronson, Elliot: Age of Propaganda: The Everyday Use and Abuse of Persuasion; New York: W.H. Freeman and Company, 1991.
(13) Rank, Hugh: Language and Public Policy; New York: Citation Press, 1974.
(14) Thum, Gladys & Thum, Marcella: The Persuaders: Propaganda in War and Peace; New York: Atheneum, 1972.



