WW1 deception #2
Description
Please reply to this post by responding to the following:
Identify three notable examples of WWI deception from different chapters of the Rankin book. Identify the type of information operation (from Week 3) represented by each example. Next, identify a psychological error or bias (from Weeks 1 or 2) that might explain why each example worked. Support your arguments using specific course material.
Course Material:
Week 1/2:
Definitions
As with past weeks, there are definitions to be derived; however, since some of these are plainly laid out in the readings, they'll only be discussed briefly here. Specifically, these would be MILDEC and MISO/PSYOP. There is another overarching term that will be added here — information operations. It has been added because you'll see this term used in many military publications that encompass aspects of our discussion. However, let's first consider a term used in this course that you won't hear much about in the readings. Nevertheless, it's an important part of deception. The term is disinformation.
Disinformation
Disinformation is a term found in the name of this course, but not a term you'll find in this week's readings. Disinformation sounds like an English word. It does have its roots there, but it came back to English after editing by the Soviets, who created the Russian term дезинформация (dezinformatsiya). Disinformation is best thought of as a specific form of false or inaccurate information that has been spread to achieve a political end (Pacepa and Rychlak 2013). In discussing the Soviet use of disinformation, Pacepa and Rychlak (2013) explain that “Elsewhere in the world, foreign intelligence services are primarily engaged in collecting information to help their heads of state conduct foreign affairs, but in Russia and later throughout the Russian sphere of influence, that task has always been more or less irrelevant. There the goal is to manipulate the future, not just to learn about the past. Specifically, the idea is to fabricate a new past for enemy targets in order to change how the world perceives them” (5).
Don’t confuse this with misinformation, which is material that is unintentionally incorrect. In military usage, disinformation may involve communicating information designed to redirect enemy efforts, as in Operation Mincemeat during WWII. This is where disinformation tends to overlap with the concept of deception. However, the term is often used differently, so be wary.
It’s an ill-defined term that is often applied widely to describe false or deceptive efforts — for example, “Country X engaged in a disinformation campaign to discredit its leading critic.” For historical references, one might look to the case of the United States’ attack on Jacobo Arbenz or the Soviet attack on Leon Trotsky. Both countries initially sought to discredit their respective targets through rumor campaigns, falsified news reports, legal efforts, diplomatic efforts, and other means. In the case of Arbenz, the effort worked. In the case of Trotsky, who advocated a form of Communism counter to Stalinism, these efforts, which included widely publicized show trials, failed. This led to several assassination attempts; in the end, an assassin with an ice axe succeeded.
Fundamentally, disinformation is intentionally false and spread intentionally. This is in contrast to a related term — misinformation — that refers to the accidental spread of false or inaccurate information. Normally, both can be involved in deception efforts. For example, an agent may intentionally promote false information (disinformation) in order to turn opinion against a target. Those who receive such information and believe it to be true might very well pass it on. If they do, they are sharing misinformation, because they don’t know that it isn’t true. Because of the convoluted nature of these related ideas, the term disinformation doesn’t appear in the lesson readings, nor will it be discussed in any substantive way.
Propaganda
We began the discussion of propaganda in the last lesson, but there is more to that discussion. First, it’s important to remember that, for the purposes of this class, propaganda involves a (state or proto-state) government transmitting a message, though the message outlets may be non-governmental (Morris 2016). Originally, propaganda had a positive connotation; it was held as a neutral term related to communicating ideas and attitudes, and many countries had some official entity known as the propaganda arm. That changed most dramatically after Allied propaganda efforts successfully attacked the propaganda efforts of the Axis, especially that of the Nazis. The United States was especially successful in executing an information campaign against the Nazi Bureau of Propaganda, and that campaign helped give propaganda a bad name. The American effort was more subtle: during World War II the United States had several propaganda arms, with the US Office of War Information being one of the most effective. Though the psychological techniques may be the same, the difference between propaganda and other forms of influence (public relations, marketing, advertising, political campaigning, etc.) is the source.
Jowett and O’Donnell define propaganda as, “a form of communication that attempts to achieve a response that furthers the desired intent of the propagandist” (2012, 1). Of course, this isn’t very helpful. That’s like saying that persuasive writing is used by an author to persuade. Fortunately, Jowett and O’Donnell build on this later in the text. They add this: “propaganda is the deliberate, systematic attempt to shape perceptions, manipulate cognitions, and direct behavior to achieve a response that furthers the desired intent of the propagandist” (2012, 7).
One addition that goes beyond the bounds of our readings involves efforts by political entities. Though these are most commonly thought of as governments, they also include organizations such as al Qaeda, Sinn Fein, and others. Most of these have political agendas that involve the creation or change of political structures that would allow them to create a state in their own image, thus my use of the term proto-state in the forum this week. By focusing our attention on propaganda as a product of states and proto-states, we stay more consistent with the modern application of this term in much of the literature and practice. Further, it helps keep us from muddling issues when considering techniques that are found in a wide range of influence and deception efforts: marketing, advertising, public relations, political campaigns, intimate relationships, etc. A former student added to this discussion with the following statement.
Propaganda takes three broad forms: white, gray, and black. White propaganda makes no effort to hide the source. A common form of this type of propaganda can be found in public diplomacy efforts conducted by the U.S. State Department. The information is normally verifiable and perceived as true by its originators. Gray propaganda tends to obscure the source but doesn’t purport to be from someone else. It also tends to be verifiable and perceived as true by its originators, though the source is obscured to increase receptivity. Black propaganda promotes a message that supports political aims but does so by attributing the material to sources other than the originator. This form of propaganda often involves falsehoods, innuendo, and other material that appeals to the recipients.
Propaganda Continued…
Though your readings don’t discuss it, there is a role known as counter-propaganda. If one understands propaganda as “the deliberate, systematic attempt to shape perceptions, manipulate cognitions, and direct behavior to achieve a response that furthers the desired intent of the propagandist” (Jowett and O’Donnell 2012, 7), then the counter is relatively simple: it involves thwarting such efforts. How that might be done depends upon the nature of the information operations being waged. Nevertheless, certain elements remain true in most situations. Whatever counter-propaganda effort may be mounted, it normally relies upon clearly and quickly conveying understandable and true information that is suitable for the targeted audience. Successful counter-propaganda unmasks the actual source behind gray and black propaganda as well as any falsehoods found in said propaganda products.
Remember that as one looks at information operations (IO), it will not always be clear where propaganda and counter-propaganda efforts fall. Many of the functions fall within the realm of MISO, but because not all propaganda/counter-propaganda efforts are solely military (no matter what the military doctrine says), these efforts will extend to other agencies as well.
Information Operations
This term is commonly applied by the U.S. military to describe a wide range of offensive and defensive functions. These are employed within the physical, informational, and cognitive domains to achieve national objectives. These operations and the underlying concepts are developed within JP 3-13, Information Operations. That publication defines IO as “the integrated employment of electronic warfare (EW), computer network operations (CNO), psychological operations (PSYOP), military deception (MILDEC), and operations security (OPSEC), in concert with specified supporting and related capabilities, to influence, disrupt, corrupt or usurp adversarial human and automated decision making while protecting our own” (DOD 2014, GL-3).
MISO/PSYOP and MILDEC
The first acronym reflects the new name, Military Information Support Operations (MISO), a politically inspired relabeling of the category of influence (both deceptive and informational) long known as psychological operations (PSYOP). Obviously, if a government entity other than the military employed such methodologies using other than military assets, it would not be called MISO. There are a number of names and euphemisms that may be applied in such cases, but they are not necessary for understanding the principles.
Introduction
It’s a fast-moving class for professionals who want to add to their skill sets or hone existing skills. This is indeed a murky arena of thinking and practice; however, it is not unfathomable. It does tend to challenge many who are comfortable with the more routinized forms of intelligence operations.
Here is a koan that points to the way we’ll likely wrestle with some ideas in this class.
Not the Wind, Not the Flag:
Two monks were arguing about a flag.
One said: “The flag is moving.”
The other said: “The wind is moving.”
The sixth patriarch happened to be passing by.
He told them: “Not the wind, not the flag; the mind is moving.”
Mumon’s Comment: The sixth patriarch said: “The wind is not moving, the flag is not moving. Mind is moving.” What did he mean? If you understand this intimately, you will see the two monks there trying to buy iron and gaining gold. The sixth patriarch could not bear to see those two dullards, so he made such a bargain (Zen@Metalab n.d.).
Wind, flag, mind moves.
The same understanding.
When the mouth opens
All are wrong.
Much of what goes on in deception, propaganda, and disinformation shares in the thought behind this koan. There appear to be many things going on. Many look and see the flag — the obvious. Others sense the wind, though nothing can be seen. Some look beyond to other factors. Some look to the mind, but all can change with the output of the mouth (communication) which shapes new meaning, new activity.
Today, discussions of these topics are viewed by many as repugnant or worse. Perhaps you had that feeling when you saw the picture of, or read the quote from, Adolf Hitler. Throughout much of history, terms such as propaganda didn’t have negative connotations. That is a recent phenomenon that largely came about because of counter-propaganda efforts by the Americans and British. However, even if that’s enough to dissuade you, consider this: these practices are used routinely by many agencies and people — even private companies make use of persuasion practices. If you’re to learn how to thwart the denial and deception efforts of others, you will need to understand the principles and practices behind them. This is the primary reason for the course.
For a professional to improve his/her ability to cut through the denial and deception inherent to these practices, a greater understanding of psychology, sociology, communication, culture, history, and more is needed. Further, the analyst needs tools that help order “habits of the mind” and limit inherent bias that enables successful deception. That’s the focus of this lesson and the entire course.
Persuasion and its Many Aspects
We begin this week by establishing fundamental definitions, concepts, and related practices. For that reason, this is one of the most fundamental of all the lessons in this course. It’s important to remember that this course, despite elements that are very personal or individual in nature, is to be viewed from the perspective of a state (country) achieving national objectives. Thus, the tools provided in the course should be viewed from that perspective to achieve the best understanding.
Within this material, it’s important to remember that there is also a hierarchy of concepts. Influence is the broadest of ideas and covers all that we talk about. Yet, it’s so broad that it’s not very helpful for much of what we’re doing in this course.
Hierarchy of Persuasion
It’s important to recognize that many things are related to persuasion, but not all of them are inherently deceptive in nature. This lesson explains elements of the first and second tiers as well as related psychological concepts. In this discussion, a number of terms will be defined in order to reduce the confusion caused by their inaccurate use in the common vernacular. This will help make the discussion in the class more productive, because confusion should be reduced as we examine aspects of the third tier.
[Graphic: the three-tier hierarchy of persuasion — 1st, 2nd, and 3rd tiers]
Definition of Key Terms
[Interactive graphic: definitions of persuasion, influence, and deception]
This section discusses the nature of influence and the subordinate concepts of persuasion and deception. It’s important to remember that many of the concepts discussed in this class look at human psychology and the ways it can be exploited or protected. In the case of deception, the exploitation normally uses “lies” of commission or omission. In other words, the deception is done by presenting false information as truth or by simply leaving information out. Yet this arena can be a murky one with frequent overlap. To help increase your ability to work through this domain, let’s first look at the concept of influence.
Key Psychological Functions
This lesson looks at two specific types of functions that are known to cause cognitive errors. The first are biases. The second are heuristics. The term bias is likely well known to you. One may be biased against a category of people, a way of doing something, or a specific thing; however, unlike the media-promoted idea of bias, biases are not always negative. For example, there’s nothing wrong with preferring rice to potatoes or westerns to mysteries. Nevertheless, biases will lead individuals to make decisions that by necessity leave out, ignore, or alter thinking in ways that might not otherwise occur. When considering intelligence collection, analysis, targeting, etc., biases may unduly remove viable, even critical, targets from consideration.
The second area involves heuristics. Heuristics are mental algorithms that speed thinking by focusing attention and streamlining analysis. For example, if you were asked whether you’d like fish or fowl for dinner, it’s unlikely that you’d run through the entire list of fish or birds that you know. Most people already have in mind what fowl they might eat. In the United States, that list would likely include chicken, turkey, and duck. More exotic eaters might include grouse, partridge, squab (young pigeon), etc. Nevertheless, it’s virtually impossible to find someone who would include hummingbirds, ostriches, penguins, egrets, golden eagles, etc. when considering what to eat for dinner. That limiting process is the result of a heuristic that tells your brain there are limited variables to be considered. Yet, while this may help in making dinner decisions, it may impede efforts to determine what an opponent might do in a real-world threat situation. A rough sketch of this pruning appears below.
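To make the “mental algorithm” idea concrete, here is a minimal sketch in Python. It is only an illustration under assumed values (the bird list and the familiarity scores are invented for this example, not drawn from the readings), but it shows how a recognition-style filter collapses a large candidate set to a few familiar options, and in doing so silently discards everything below its threshold.

```python
# A toy model of a heuristic as a "mental algorithm": instead of
# evaluating every known bird, prune to familiar food options.
# The bird list and familiarity scores are invented placeholders.

ALL_BIRDS = ["chicken", "turkey", "duck", "grouse", "partridge", "squab",
             "hummingbird", "ostrich", "penguin", "egret", "golden eagle"]

# Familiarity-as-food, from 0.0 (never considered) to 1.0 (everyday fare).
FAMILIARITY = {"chicken": 1.0, "turkey": 0.9, "duck": 0.7, "grouse": 0.3,
               "partridge": 0.25, "squab": 0.2, "ostrich": 0.05,
               "hummingbird": 0.0, "penguin": 0.0, "egret": 0.0,
               "golden eagle": 0.0}

def dinner_candidates(threshold=0.2):
    """Heuristic: only consider birds above a familiarity threshold."""
    return [b for b in ALL_BIRDS if FAMILIARITY.get(b, 0.0) >= threshold]

print(dinner_candidates())      # fast, short list: the everyday choices
print(dinner_candidates(0.0))   # exhaustive search: everything, penguins included
```

The same pruning that makes the dinner decision fast is what an adversary can exploit: any option the analyst’s threshold screens out is never evaluated at all.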
Confirmation Bias
Confirmation bias refers to the tendency for people to select those elements that support their pre-conceived notions. Depending on how the information is handled, it may also be called the Texas Sharpshooter Fallacy, cherry picking, “myside” bias, confirmatory bias, etc. Regardless of the name, the process involves mental efforts to eliminate any competing ideas and focus on those data points that support one’s case. One simply draws the proverbial bullseye around those data points that fit preconceived ideas.
Elections are an excellent place to look for confirmation bias. If you love Candidate Smith but despise Candidate Jones, you’ll look for information that supports your candidate and denigrates his/her opponent. Further, when things become too troubling, you might find yourself coloring that information to fit your biases. For example, if Candidate Smith barely squeaks out of criminal charges after an investigation, you might trumpet the vindication of your Smith. You might even trumpet Smith’s innocence and the abuse of power by those doing the investigation, even if many examples of troubling or questionable behavior came to light in that investigation. Conversely, you’d delight to see Jones being called out for kicking his/her neighbor’s dog. For you, this might be taken as clear evidence of how evil Jones really is. The bottom line is simple: humans seek to be right. Thus, they’ll look for evidence to support their views, even if that means ignoring clear evidence to the contrary.
One of the easiest examples of confirmation bias to visualize is the Texas Sharpshooter Fallacy. Imagine a less-than-stellar shooter firing at the side of a barn. After firing all his/her rounds, the next step is to see how accurate the shooting was. Now imagine the shooter circling the biggest group of shots with a bullseye and then drawing concentric circles out from there. Success! Of course, not starting with the bullseye brings the shooter’s accuracy into question. Yet this is often how people approach analysis of issues: begin with existing biases, desired outcomes, etc., then find ways to pull them together into a credible package to show to others. The short simulation below makes the effect visible.
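The fallacy is easy to demonstrate numerically. The following Python sketch (a toy simulation invented for this lesson, not taken from the readings) fires purely random shots at a wall and then “draws the bullseye” around the densest cluster after the fact. Accuracy measured against the post-hoc bullseye always looks far better than accuracy against a target fixed in advance.

```python
import random

random.seed(1)

# Thirty random "shots" on a 10x10 wall: no skill involved at all.
shots = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(30)]

def hits_within(center, radius=1.5):
    """Count shots inside a circular bullseye at the given center."""
    cx, cy = center
    return sum((x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2 for x, y in shots)

# Honest test: the bullseye was fixed in advance at the wall's center.
print("hits on the pre-drawn bullseye:", hits_within((5, 5)))

# Texas Sharpshooter: after shooting, center the bullseye on whichever
# shot has the most neighbors -- i.e., circle the densest cluster.
best_center = max(shots, key=hits_within)
print("hits on the post-hoc bullseye:", hits_within(best_center))
```

Nothing about the shooting changes between the two measurements; the only difference is choosing the success criterion after seeing the data, which is exactly what the analyst does when the bullseye is drawn around pre-selected evidence.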
In the realm of intelligence, it’s easy to see where this could influence one’s analysis or actions. When someone “knows” the bad guy and what the bad guy will do, such an individual will be looking for confirmation of what is already known. That’s true for many people, even when there is clear evidence to the contrary. People like to know they’re right, and they like things that are easy. Voila! An answer with little analysis, based on assumptions. One such example can be seen in the Vietnam War. Many U.S. analysts looked at the North Vietnamese military leader General Vo Nguyen Giap through the lens of their training. As an Asian Communist, they “knew” that he must have borrowed his warfare theories from the Chinese Communist leader Mao Zedong. After the war, corrections were necessary. In fact, the underlying animosity between Vietnam and China, as well as Giap’s European education, meant that he favored Clausewitz and Jomini for his insights into conducting warfare. By the way, these were the two predominant influences on U.S. land-force doctrine and practice!
One of the best counters to this problem involves establishing criteria for analysis beforehand. Such criteria must be written out and available to others who review the final work. This increases the odds that both the originator and reviewer might catch problems in the analysis. Without established analytical frameworks and measures, it’s virtually impossible to avoid some degree of confirmation bias.
Normalcy Bias
This problem tends to be most evident in high stress or crisis situations. Though there is often discussion of the “fight or flight” reflex in which humans either confront or actively avoid a problem or conflict, normalcy bias is the less discussed “hide” reflex. Many creatures exhibit this behavior. It can be a life-saving process when a creature relies upon its natural camouflage or superior position to avoid detection by predators. Unfortunately, this doesn’t help individuals trapped in a fire or other situations wherein the threat will overtake them.
The type of behavior caused by normalcy bias has often been blamed for injury and death in humans. For example, in aircraft fires on the ground, many survivors report other unharmed passengers sitting motionless in their seats or going about normal tasks, e.g., collecting their things, that were inappropriate for the situation. Seldom were these people among the survivors unless others intervened. Mentally, the situation overwhelmed their ability to process it because they had never experienced or even considered such a catastrophic situation. Survivors from such accidents typically came from three groups. The first group had either received prior training and/or had considered the possibility beforehand and prepared a plan (remember those boring pre-flight briefings by the flight crew?). The second group had been helped out of the burning craft by members of the first group or by outside rescuers. The third and final group might be considered the “blind luck” group, because they were often ones who had fallen out, been blown out, or otherwise been removed from the situation through no effort of their own or others. Of course, normalcy bias doesn’t just come into play in aircraft accidents. It’s seen in many human interactions.
Open conflict such as fights, arrests, and combat also tends to trigger normalcy bias. If you’ve ever been in any of these situations, you know the responses triggered by your body and the way time awareness changes. You probably also realize how important your training and experience were in moving you through the process. For those who lack such experience, trust the rest of us! The human brain must process new situations, but not all situations are conducive to on-the-job learning. There are many combat systems that use a color scheme to represent this process. For example, military and police often use one that visualizes green as normal conditions, yellow as high-alert status, and red as active threat/crisis. That’s simple enough, because similar concepts are seen elsewhere. However, it’s the final stage that is tied to normalcy bias — black. In crisis, people can easily go from green or yellow to black — the stage in which the mind shuts down or slows so dramatically that meaningful action is no longer possible. It’s at the black stage of mental processing that normalcy bias is at its worst. It’s here that an opponent might capture, wound, or kill you while your mind is still processing options or falling back on accustomed routines for lack of any decision.
Normalcy bias can affect those in intelligence in a number of ways. Because humans seek to establish a normal state, any change to that can cause mental roadblocks to analysis. It may be as simple as slowing the process or as bad as “locking up” someone’s mental functions for a period of time. This can happen even outside of direct combat.
As has already been discussed, one of the best counters to normalcy bias is experience. This may come from actual experience or experience gained in training. It may also be “borrowed” from others by the use of simple devices like checklists — mental or actual. This is why most military organizations and some parts of the intelligence community keep checklists at hand. When the feces hit the proverbial rotating blades, it’s not time to start thinking from scratch. Even in analysis, one might run a checklist, analysis form, or some other device that moves one step by step through the needed analysis action items.
Another device for overcoming the problems associated with normalcy bias is the practice of running worst case analysis or planning. If one considers what might happen next, that individual will be more likely to respond effectively if conditions change. This is somewhat true, even when the conditions don’t change exactly as predicted. As noted earlier, a good example of this can be seen in the testimonies of those in aircraft accidents. Many of those who survive often attribute their actions to thinking about the worst-case possibilities and the necessary actions to respond. Many of the remaining survivors attribute their survival to being pushed, pulled or otherwise directed by those who had planned ahead.
Framing
When a person looks at a problem, it is seldom with fresh eyes (without bias). Not surprisingly, the new problem is viewed through the lens of past experience. That experience is linked to the current context of the problem and helps put the “frame” around what will be examined and how it will be viewed. As a rule, younger people have fewer points of reference to draw on and thus may be more flexible in their processing. As one ages, as long as cognitive faculties remain intact, one’s increased number of points of reference can increase analytic speed while also providing less biased results (Erber 2010; Peters, Finucane, MacGregor, and Slovic 2000). Age alone isn’t sufficient for this, though; individuals must have training and life experience to draw on. Older or younger individuals without cognitive resources from training and experience are more likely to use emotional frames of reference (Watanabe and Shibutani 2010).
Because framing leads individuals to apply past practices and ideas, it is often linked to or used synonymously with agenda setting. Often the term agenda setting is used in the context of past media messaging that set the “agenda” for one’s thinking and actions. In this, there is normally a prioritization of what is to be accepted and what is to be rejected (McCombs and Shaw 1972).
One of the ways intelligence professionals have found to reduce the effects of framing has been to employ techniques like Red Team or Red Hat exercises that place them in the role of an opponent or other actor. When one is forced to think like someone else, it can easily highlight the problems that come from applying one’s own experience to that of others. The more distant the culture between analyst and target, the more necessary such efforts are to eliminate errors caused by framing.
Priming
Priming is related to but notably different from framing. Humans respond to their environment both physically and mentally. One of the most common aspects of mental engagement is called priming. Priming helps the mind focus on a specific schema (the way humans order/categorize the world, e.g., all cats purr and have tails, all chairs have four legs, etc.). Despite debate about how it works, there is clear evidence that humans tend to use the most immediate schema created by recent stimuli. It may be immediate because the person in question has just heard, seen, or experienced something related. Nevertheless, the resulting mental activity is implicit, meaning it is not consciously recognized or processed.
A classic example might be seen in the famous Alfred Hitchcock movie Psycho. This movie includes one of the most famous horror scenes in western cinema: a shadowy figure with a knife attacks and kills a young woman in a motel shower. Viewers continue to report an increased fear of attack while bathing/showering after viewing this scene. If you were one of those, you might consider actions you took — lock the door(s), check the window(s), consider routes of escape, consider means of defense, etc. If you did anything like this while still using your regular bathroom, the only thing that changed was awareness (priming) provided by said movie scene.
A common example of this becomes evident to many people when they make a major purchase. For example, when a person buys a new car, he or she may suddenly see the same type of car “everywhere.” The cars existed before the purchase, but there was no reason to focus on their presence. Now they seem frequent, and the purchaser begins to construct ideas about who buys them, why they buy them, etc. Priming can be both negative and positive, and in each case it uses the most recent, relevant information to interpret events in the current environment. In the first example, the shower scene supplies negative priming cues about what bad things could happen in a similar situation; notably, negative priming can slow mental processing. In the second, positive priming speeds processing: the new buyer suddenly sees things that never evoked awareness before and, with this new awareness, more quickly recognizes similar vehicles.
As might be seen from these examples, the priming effect tends to be shorter-lived than the effects of framing (Roskos-Ewoldsen, Roskos-Ewoldsen, and Carpentier 2009). The more recent and intense primes will create stronger effects (Roskos-Ewoldsen et al. 2009). Not surprisingly, once one is primed with specific concepts and the conditions influence one’s emotions, attitude and behavior changes will normally follow quickly and without conscious thought.
Though priming, like framing, tends to highlight or make salient a specific point, it doesn’t tend to provide specific evaluative or prioritizing suggestions the way framing does (Scheufele and Tewksbury 2007). Once opinions are set, there is a tendency for individuals to seek information that is consistent with their views. This can be present in both framing and priming; however, in priming this information tends to fit existing evaluative measures rather than providing the evaluative measures as in framing. The effects of priming in this way have been seen to affect evaluations of politicians (Iyengar and Kinder 1987; Sheafer and Weimann 2005; Moy, Xenos, and Hess 2006) and the way other genders, races, classes, etc. are perceived (Hansen and Hansen 1988; Oliver, Ramasubramanian, and Kim 2007). From these primes, people construct mental models to better understand the situation and to prepare for future events (Wyer 2004; Johnson-Laird 1983; Roskos-Ewoldsen et al. 2009; Wyer and Radvansky 1999).
Analysts are constantly influenced in ways that might not be evident. For example, subtle facial gestures, body gestures, or changes in tone or word use by managers and commanders may trigger thinking that is less than optimal for good analysis. Though leaders sometimes make it clear what they want an analyst to find, the influence is often more subtle. Yet the human mind is wired to detect and act on these cues, and studies support this; for example, exposure to certain words has been shown to trigger changes in audience response (Draine and Greenwald 1998).
An essential counter to the problem of priming is self-awareness. Where are your blind spots, problem areas, personal biases, etc.? These are the areas where priming is most likely to pass undetected. It’s impossible to deflect every prime, given the mass of information inputs in any given day; however, one can minimize the effect by increased analysis of inputs. The more emotionally laden the input, the more care is needed to analyze it.
Availability Heuristic
Some estimates put the number of decisions made by the average American at more than 50,000 a day! Thus, it’s not surprising that the brain creates shortcuts to reduce the overall processing burden (Tversky and Kahneman 1974). As a result, mundane activities may be categorized or analyzed in ways that are not necessarily accurate. For example, in one study test subjects were asked to list six reasons they might consider themselves assertive. Subjects in another group were asked to list 12 reasons. Not surprisingly, more of the subjects asked to list six were able to complete all or most of the list. In contrast, those asked for 12 reasons did not complete their list. When both groups were asked how assertive they felt, the six-reason group scored themselves higher. The evidence suggested they did this because their list felt complete and the reasons came to mind easily, even though those in the 12-reason group often had more than six reasons to support their assertiveness.
Often those who use this technique to influence others will use more vivid or emotional content in their communications. They will also repeat key elements more often. The result is that the message is more firmly lodged in the target’s brain. A classic example of this in interpersonal communication involves a “hunk” or “babe” communicating with a target who “wasn’t in their league.” A touch, a wink, and some suggestions might lead the target to “decide for himself/herself” to do exactly what is being suggested. Of course, there are many other examples, but the key is the impact of highly visual, emotive, and/or repetitive language from others.
In a professional setting, it is common for one to act on the most immediate recall. The easier it is to recall the potential benefit or penalty for an action or inaction (often reinforced by visual, emotive, or repetitious elements), the more that recall will drive the decision-making process. Perceived frequency may not always come from a single event; it can be inferred from the mind’s attempts to link seemingly related events of sufficient immediacy and impact (Tversky and Kahneman 1973).
Countering the problems created by the availability heuristic calls for the same actions used in dealing with framing and priming. One must know oneself and ask questions about what is being decided, how it’s being decided, why it’s being decided, etc. Consider who might have influenced you in the process. Was there a push from leadership or a friend?
Anchoring
Framing draws on existing knowledge. Priming relates to recent stimuli, but anchoring relates to your very first impression. This is the tendency for humans to fixate on the first thing they see or hear. Those selling you things rely on this heavily. Ever seen the price tag that’s been “slashed” to give you deep discounts? That great Item X was $975, but now you can get it for only $375! Wow! How could you turn it down? Anchoring is at the heart of negotiations too. The first one to announce a price or other negotiating point has set the anchor. Trained negotiators know how to work around this, but most people just stick to that anchor when they make their counteroffer, if any. Yet it doesn’t occur only in sales and negotiations. Here’s an example from David McRaney’s You Are Not So Smart (2012).
Answer this: Is the population of Uzbekistan greater or fewer than 12 million? Go ahead and guess.
OK, another question: how many people do you think live in Uzbekistan? Come up with a figure and keep it in your head. We’ll come back to this in a few paragraphs (McRaney 2012, 215).
Humans are given things to consider every day. Analysts are no exception; however, these considerations are never made in isolation. Consider a situation in which you begin the day with a briefing from your intel manager. The emphasis of the briefing involves a new problem with Terror Organization X. You’re given information that shows some of this activity is taking place in your area of responsibility. What’s your likely tendency? Your leadership is interested in you finding something. Of course, you’re a professional and want to succeed. Thus, you have both organizational and personal motivations to start from this anchor point. This will help you look within a specific range for things. This could lead you to find or miss things that aren’t specifically connected.
Back to Uzbekistan. The populations of Central Asian states probably aren’t numbers you have memorized. You need some sort of cue, a point of reference. You searched your mental assets for something of value concerning Uzbekistan — the terrain, the language, Borat — but the population figures aren’t in your head. What is in your head is the figure I gave, 12 million, and it’s right there up front. When you have nothing else to go on, you fixate on the information at hand. The population of Uzbekistan is about 28 million people. How far away was your answer? If you are like most people, you assumed something much lower. You probably thought it was more than 12 million but less than 28 million.
You depend on anchoring every day to predict the outcome of events, to estimate how much time something will take or how much money something will cost. When you need to choose between options, or estimate a value, you need footing to stand on. How much should your electricity bill be each month? What is a good price for rent in this neighborhood? You need an anchor from which to compare, and when someone is trying to sell you something, that salesperson is more than happy to provide one. The problem is, even when you know this, you can’t ignore it (McRaney 2012, 215).
In the realm of analysis, there are a number of counters to this problem. The key is to use techniques that evaluate your answers. Humans are too quick to accept their own answers, whether because they lack insight, are lazy, are arrogant, or for many other reasons. Thus, one must use devices that constructively question the decision-making process and the validity of the final decision. Some recommended tools from the intelligence community include processes like Diagnostic Reasoning, Analysis of Competing Hypotheses, and Argument Mapping. This is not an exhaustive list, though; there are many techniques and practices that can help in this area. Sometimes something as simple as getting a disinterested party to evaluate your conclusion can help in a pinch. A sketch of how one such tool structures the problem follows.
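To give a flavor of how one of these tools imposes structure, here is a minimal sketch in the spirit of Analysis of Competing Hypotheses. The hypotheses, evidence items, and scores are invented placeholders, not course material; the point is only the method’s key move: ranking hypotheses by how little evidence contradicts them rather than by how much appears to support them.

```python
# Minimal ACH-style matrix. Hypotheses, evidence, and scores are invented
# placeholders. Scores: -1 inconsistent, 0 neutral, +1 consistent.
EVIDENCE = {
    "increased radio traffic":  {"H1: attack north": +1, "H2: attack south": +1, "H3: no attack": -1},
    "bridging equipment moved": {"H1: attack north": +1, "H2: attack south": -1, "H3: no attack": -1},
    "leave policy unchanged":   {"H1: attack north":  0, "H2: attack south":  0, "H3: no attack": +1},
}

def inconsistency(hypothesis):
    """Count the evidence items that contradict a hypothesis."""
    return sum(1 for scores in EVIDENCE.values() if scores[hypothesis] < 0)

# ACH's key move: the surviving hypothesis is the LEAST contradicted one,
# which blunts the urge to tally only confirming evidence.
for h in sorted(["H1: attack north", "H2: attack south", "H3: no attack"],
                key=inconsistency):
    print(h, "- inconsistent items:", inconsistency(h))
```

Writing the matrix down before reaching a conclusion is itself the counter: every hypothesis must face the same evidence, which is precisely what confirmation bias and anchoring otherwise prevent.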
Week 3/4:
Introduction
Deception in wartime is a time-honored practice, but the advent of World War I forced rapid adaptation of new forms of deception that went largely beyond the tricks and tactics common to earlier warfare. Technology demanded a shift from deception as an individual art — think of the ghillie suit of the hunter turned sniper — to a national, industrial endeavor — think of hiding towns and fleets of ships.
In this lesson, we will examine the rise of deception science as well as the extension of deception art in support of the military and diplomatic goals of World War I. There were innovations in vehicles: Q-ships, dazzle paint, and camouflage for all types of military equipment. There were innovations for individuals, with the standardization of uniform camouflage and deceptive protective devices (helmets, sniper stands, etc.). Further, policies changed, sometimes despite international law, such as the way German submarine crews responded to enemy shipping. Many ancient techniques were also dusted off and improved with greater emphasis on coordination of effort, e.g., the British haversack ruse in the Palestine campaign or the misdirection of targets at places like Gallipoli.
Camouflage (Core Component of Military Deception)
World War I marked the point at which deception transitioned from art alone to deception science. Industrialization, and the associated advances in academic disciplines, opened the way to more systematized research and application of deception methods. In WWI, we see many cases of mixing the two. As noted in the readings, artists often drove the work for the British, but manufacture and application fell to the sciences of industrialization. Conversely, some sciences were applied by artists. This is especially true of camouflage.
In fact, the term camouflage was birthed in WWI, as you’ll see in the assigned readings this week. Though some camouflage practices predate this conflict, they often required lengthy explanation for the uninitiated. After WWI, the term camouflage was widely recognized and understood. The word is believed to come from camoufler, Parisian slang meaning “to disguise,” and was first seen in 1917 (Online Etymology Dictionary 2016a, np).
Nicholas Rankin, a well-known author on deception and camouflage, points to the evolutionary nature of camouflage in nature and sees a similar process being spurred by the rapid technological innovations of World War I. In addition to the responses to machine guns, artillery, and other weapons, he sees aircraft as the most fundamental game changer of the war. “It’s the evolution of flight that makes camouflage important, because they’re … like birds of prey that can see creatures on the ground. The creatures on the ground must become like moles and voles… Camouflage is about hiding and concealing and deceiving the predator. What you want the enemy to see is not where you really are but where you aren’t” (Rankin 2008, n.p.).
Of course, the British were not the only ones schooled in deception. The following article highlights the French building of a fake Paris to deceive German pilots and artillery spotters (Cooke 2011). As you’ll find in this week’s readings, many of these efforts initially relied more on artists than on engineers and scientists.
Propaganda
Propaganda predates WWI by centuries. In fact, the first reference we know of comes in the early 1600s from a specific function within the Catholic Church — Sacra Congregatio de Propaganda Fide (Sacred Congregation for the Propagation of the Faith) — which sought to “propagate the faith,” or in other words, make converts (Online Etymology Dictionary 2016b, np). Propaganda comes from the Latin propagare, which described the farmer’s practice of propagating new vines from cuttings. You may see where this might lead with the psychological messaging and wordsmithing that would follow. The term became increasingly politicized and linked to countries with the coming of WWI. Until then, propaganda was considered the selective use of communication to promote a country’s interests through persuasion, often by diplomatic communications.
WWI quickly integrated new technology, from faster, more capable printing presses to radio and film. Here too, these processes were in their infancy, but the stage was being set for more rapid advances in the coming years. Here’s an excellent summary piece that describes general practices found in British and German propaganda during WWI (Josh 2008).
Effective deception requires forethought, innovation, rapid adaptability, and coordination to achieve utmost effectiveness. None of this was evident at the onset of WWI. However, by the end of war, there had been great advances in the first three. The last element — coordination — was still in its infancy, but by World War II there had been significant leaps forward.
As you read this week, look for connections to earlier classes. Where do you see effective exploitation of the biases and heuristics noted in Lesson One? Do you see early application of propaganda principles in Lesson Two? What about specific information operations elements as discussed in Lesson Three?
Key Stages to “Soften up” Public Opinion
Though his work was written long after WWI, Rune Ottosen, a journalism professor and advocate of “peace journalism,” summarizes “several key stages of a military campaign to ‘soften up’ public opinion through the media in preparation for an armed intervention” (The Peace Journalism Option 2000, n.p.). These include the following stages:
THE PRELIMINARY STAGE
THE JUSTIFICATION STAGE
THE IMPLEMENTATION STAGE
THE AFTERMATH
Another presenter, O’Kane, notes that “there is always a dead baby story” (The Peace Journalism Option 2000, n.p.), and it comes at the key point of the Justification Stage — in the form of a story whose apparent urgency brooks no delay; specifically, no time for cool deliberation or negotiating on peace proposals. Human interest stories … are ideal for engendering this atmosphere.
This is consistent with the work of Phillip Knightley, which offers a further delineation of Ottosen’s stages:
THE CRISIS
THE DEMONISATION OF THE ENEMY’S LEADER
THE DEMONISATION OF THE ENEMY AS INDIVIDUALS
ATROCITIES
Key Stages to “Soften up” Public Opinion Continued…
Efforts like these helped rally the population and focus its hatred on the primary enemy — the Germans. The next step was to tell stories in support of hiding valuable targets and exposing decoys in order to further the war effort. When combined with effective storytelling and other information operations efforts like OPSEC, these deception efforts were far more successful. Of course, this storytelling often emphasized incomplete or misdirected reporting; the intentional inaccuracies were necessary to keep enemy attention away from key efforts. On the ground, it might be a feint in one area to draw the enemy’s attention away from the primary thrust of an attack. Camouflage was ultimately seen to be an important element in “selling” the related story, and failure to apply such efforts often taught hard lessons. The American Expeditionary Force found this out the hard way (Parkinson 2012).
References
Cooke, Sonia van Gilder. 2011. “Bizarre: France Built a Decoy Paris to Fool German Bombers During World War I.” Time, November 11. Accessed August 17, 2016. http://newsfeed.time.com/2011/11/11/france-built-a-decoy-paris-to-fool-german-bombers-during-world-war-i/.
Josh (no last name given). 2008. “The Battle for the Mind: German and British Propaganda in the First World War.” April 25. Accessed August 17, 2016. https://quadri.wordpress.com/2008/04/25/the-battle-for-the-mind-german-and-british-propaganda-in-the-first-world-war/.
Online Etymology Dictionary. 2016a. “Camouflage.” http://www.etymonline.com/index.php?term=camouflage.
Online Etymology Dictionary. 2016b. “Propaganda.” http://www.etymonline.com/index.php?term=propaganda.
Parkinson, E. Malcolm. 2012. “The Artists at War: Painters, Muralists, Architects Worked to Provide Camouflage for Troops in World War I.” Prologue Magazine 44, no. 1 (Spring). Accessed November 18, 2016. https://www.archives.gov/publications/prologue/2012/spring/camouflage.html.
Rankin, Nicholas. 2008. “Nicholas Rankin on the History of Camouflage.” YouTube. Accessed August 11, 2016. https://www.youtube.com/watch?v=z_MfYlqDodY.
Shah, Anup. 2005. “War, Propaganda and the Media.” Accessed November 18, 2016. http://www.globalissues.org/article/157/war-propaganda-and-the-media.
The Peace Journalism Option. 2000. Accessed November 22, 2016. http://web.archive.org/web/20000822111932/www.poiesis.org/pjo/pjotext.html.
Image Citations
Everything for the Front, USSR WWII propaganda poster – Public Domain
“Britain’s Sea Power Is Yours! Poster” by http://www.iwm.org.uk/collections/item/object/4976.
“HMS Tamarisk, British First World War Q-ship” by https://commons.wikimedia.org/wiki/File:HMS_Tamarisk.jpg.
“SS Evelyn moored pierside, circa 1917-18, place unknown” by https://commons.wikimedia.org/wiki/File:USS_Asterion_%28AK-100%29_01.jpg.
“USS O’Brien (Destroyer # 51) bringing in a convoy, 1918.” by https://commons.wikimedia.org/wiki/File:USS_O’Brien_%28DD-51%29_in_dazzle_camouflage,_1918.jpg.
“Submarine commander’s periscope view of a merchant ship in dazzle camouflage (left) and the same ship uncamouflaged (right).” by https://en.wikipedia.org/wiki/File:EB1922_Camouflage_Periscope_View.jpg.
“German World War I observation post disguised as a tree.” by https://en.wikipedia.org/wiki/File:Disguisetactics.jpg.
“Boys! Remember Nurse Cavell Poster” by http://digital.slv.vic.gov.au/view/action/nmets.do?DOCCHOICE=950418.xml&dvs=1488473255147~802&locale=en_US&search_terms=&adjacency=&VIEWER_URL=/view/action/nmets.do?&DELIVERY_RULE_ID=4&divType=&usePid1=true&usePid2=true.
“Title: Times are hard your Majesty you leave us nothing to do poster” by https://commons.wikimedia.org/wiki/File:%22Times_are_hard_your_Majesty_-_you_leave_us_nothing_to_do%22_LCCN00651848.jpg.
Forum Feedback Instructions
Your Initial Post must be:
posted by Thursday evening;
a minimum of 250 words and closer to 500 words; and
contain reference to at least two of the lesson’s assigned readings.
Peer Responses: Respond to at least 2 other students:
By Sunday evening;
With responses between 100 and 200 words;
Containing reference to at least one of the lesson's assigned readings; and
Including direct questions.
Forum Engagement & Professor Queries: In addition, you need to:
Monitor the postings throughout the week; and
Respond to my queries/questions.
Initial Post Due: Thursday, by 11:55pm ET
Responses Due: Sunday, by 11:55pm ET