Intelligence analysis is plagued by many of the cognitive traps also encountered in other disciplines. The first systematic study of the specific pitfalls lying between an intelligence analyst and clear thinking, Psychology of Intelligence Analysis, was carried out by Dick Heuer in 1999. [1] According to Heuer, these "cognitive traps for intelligence analysis" may be rooted either in the analyst's organizational culture or in his or her own personality.
The most common personality trap, known as mirror-imaging, [2] is the analyst's assumption that the people being studied think like the analyst does. An important variation is confusing actual subjects with one's information or images about them, much as the apple one eats differs from the ideas and issues it may raise. This poses a dilemma for the scientific method in general, since science uses information and theory to represent complex natural systems as if theoretical constructs might be in control of indefinable natural processes. An inability to distinguish subjects from what one is thinking about them is also studied under the heading of functional fixedness, first examined in Gestalt psychology and in relation to the subject–object problem.
Experienced analysts may recognize that they have fallen prey to mirror-imaging if they discover that they are unwilling to examine variants of what they consider most reasonable in light of their personal frame of reference. Less-perceptive analysts affected by this trap may regard legitimate objections as a personal attack, rather than looking beyond ego to the merits of the question. Peer review (especially by people from a different background) can be a wise safeguard. Organizational culture can also create traps which render individual analysts unwilling to challenge acknowledged experts in the group.
Another trap, target fixation, has an analogy in aviation: it occurs when pilots become so intent on delivering their ordnance that they lose sight of the big picture and crash into the target. This is a more basic human tendency than many realize. Analysts may fixate on one hypothesis, looking only at evidence that is consistent with their preconceptions and ignoring other relevant views. The desire for rapid closure is another form of idea fixation.
"Familiarity with terrorist methods, repeated attacks against U.S. facilities overseas, combined with indications that the continental United States was at the top of the terrorist target list might have alerted us that we were in peril of a significant attack. And yet, for reasons those who study intelligence failure will find familiar, 9/11 fits very much into the norm of surprise caused by a breakdown of intelligence warning." [3] The breakdown happened, in part, because there was poor information-sharing among analysts (in different FBI offices, for example). At a conceptual level, US intelligence knew that al-Qaida actions almost always involve multiple, near-simultaneous attacks; however, the FBI did not assimilate piecemeal information on oddly behaving foreign flight-training students into this context.
On the day of the hijackings (under tremendous time pressure), no analyst associated the multiple hijackings with the multiple-attack signature of al-Qaeda. The failure to conceive that a major attack could occur within the US left the country unprepared. For example, irregularities detected by the Federal Aviation Administration and North American Aerospace Defense Command did not flow into a center where analysts could consolidate this information and (ideally) collate it with earlier reports of odd behavior among certain pilot trainees, or the possibility of hijacked airliners being used as weapons.
Inappropriate analogies are yet another cognitive trap. Though analogies may be extremely useful, they can become dangerous when forced or when they are based on assumptions of cultural or contextual equivalence. Avoiding such analogies is difficult when analysts are merely unaware of differences between their own context and that of others; it becomes extremely difficult when they are unaware that important knowledge is missing. Difficulties associated with admitting one's ignorance are an additional barrier to avoiding such traps. Such ignorance can take the form of insufficient study: a lack of factual information or understanding; an inability to mesh new facts with old; or a simple denial of conflicting facts.
Even extremely creative thinkers may find it difficult to gain support within their organization. Often more concerned with appearances, managers may suppress conflict born of creativity in favor of the status quo. A special case of stereotyping is stovepiping, whereby a group heavily invested in a particular collection technology ignores valid information from other sources (functional specialization). It was a Soviet tendency to value HUMINT (HUMan INTelligence), gathered from espionage, above all other sources; Soviet OSINT (open-source intelligence) was forced to develop outside the state intelligence organization, in the USA (later USA–Canada) Institute of the Soviet Academy of Sciences. [4]
Another specialization problem may come as a result of security compartmentalization. An analytic team with unique access to a source may overemphasize that source's significance. This can be a major problem with long-term HUMINT relationships, in which partners develop personal bonds.
Groups (like individual analysts) can also reject evidence which contradicts prior conclusions. When this happens it is often difficult to assess whether the inclusion of certain analysts in the group was the thoughtful application of deliberately contrarian "red teams", or the politicized insertion of ideologues to militate for a certain policy. Monopolization of the information flow (as caused by the latter) has also been termed "stovepiping", by analogy with intelligence-collection disciplines.
There are many levels at which one can misunderstand another culture, be it that of an organization or a country. One frequently encountered trap is the rational-actor hypothesis, which ascribes rational behavior to the other side, according to a definition of rationality from one's own culture.
The social anthropologist Edward T. Hall illustrated one such conflict [5] with an example from the American Southwest. "Anglo" drivers became infuriated when "Hispanic" traffic police would cite them for going one mile per hour over the speed limit, although a Hispanic judge would later dismiss the charge. "Hispanic" drivers, on the other hand, were convinced that "Anglo" judges were unfair because they would not dismiss charges because of extenuating circumstances.
Both cultures were rational with regard to law enforcement and the adjudication of charges; indeed, both believed that one of the two had to be flexible and the other had to be formal. However, in the Anglo culture, the police had discretion with regard to issuing speeding tickets, and the court was expected to stay within the letter of the law. In the Hispanic culture, the police were expected to be strict, but the courts would balance the situation. There was a fundamental misunderstanding; both sides were ethnocentric, and both incorrectly assumed the other culture was a mirror image of itself. In that example, denial of rationality was the result in both cultures, yet each was acting rationally within its own value set.
In a subsequent interview, Hall spoke widely about intercultural communication. [6] He summed up years of study with this statement: "I spent years trying to figure out how to select people to go overseas. This is the secret. You have to know how to make a friend. And that is it!"
To make a friend, one has to understand the culture of the potential friend, one's own culture, and how things which are rational in one may not translate to the other.
Hall asserted:
If we can get away from theoretical paradigms and focus more on what is really going on with people, we will be doing well. I have two models that I used originally. One is the linguistics model, that is, descriptive linguistics. And the other one is animal behavior. Both involve paying close attention to what is happening right under our nose. There is no way to get answers unless you immerse yourself in a situation and pay close attention. From this, the validity and integrity of patterns is experienced. In other words, the pattern can live and become a part of you.
The main thing that marks my methodology is that I really do use myself as a control. I pay very close attention to myself, my feelings because then I have a base. And it is not intellectual.
Proportionality bias assumes that small things in one culture are small in every culture. In reality, cultures prioritize differently. In Western (especially Northern European) culture, time schedules are important, and being late can be a major discourtesy. Waiting one's turn is the cultural norm, and failing to stand in line is a cultural failing. "Honor killing" seems bizarre in some cultures but is an accepted part of others.
Even within a culture, however, individuals remain individual. Presumption of unitary action by organizations is another trap. In Japanese culture, the lines of authority are very clear, but the senior individual will also seek consensus. American negotiators may push for quick decisions, but the Japanese need to build consensus first; once it exists, they may execute it faster than Americans.
The analyst's country (or organization) is not identical to that of their opponent. One error is to mirror-image the opposition, assuming it will act the same as one's country and culture would under the same circumstances. "It seemed inconceivable to the U.S. planners in 1941 that the Japanese would be so foolish to attack a power whose resources so exceeded those of Japan, thus virtually guaranteeing defeat". [3]
In like manner, no analyst in US Navy force protection conceived of an Arleigh Burke-class destroyer such as the USS Cole being attacked with a small suicide boat, much like those the Japanese planned to use extensively against invasion forces during World War II.
An opponent's cultural framework affects its approach to technology. That complicates the task of one's own analysts in assessing the opponent's resources and how they may be used, and in defining intelligence targets accordingly. Mirror-imaging, committing to a set of common assumptions rather than challenging those assumptions, has figured in numerous intelligence failures.
In the Pacific Theater of World War II, the Japanese seemed to believe that their language was so complex that even if their cryptosystems, such as the Type B Cipher Machine (codenamed Purple), were broken, outsiders would not really understand the content. That was not strictly true, but it was sufficiently true that there were cases in which even the intended recipients did not clearly understand the writer's intent.[citation needed]
On the other side, the US Navy assumed that ships anchored in the shallow waters of Pearl Harbor were safe from torpedo attack even though in 1940, at the Battle of Taranto, the British had made successful shallow-water torpedo attacks against Italian warships in harbor.
Even if intelligence services had credited the conspirators behind the September 11, 2001 attacks with the organizational capacity necessary to hijack four airliners simultaneously, no one would have suspected that the hijackers' weapon of choice would be the box cutter. [3]
Likewise, the US Navy underestimated the danger of suicide boats in harbor and set rules of engagement that allowed an unidentified boat to come alongside the USS Cole without being warned off or fired on. A Burke-class destroyer is one of the most powerful[citation needed] warships ever built, but US security policies did not protect the docked USS Cole. [3]
Mirror-imaging can be a major problem for policymakers as well as analysts. During the Vietnam War, Lyndon B. Johnson and Robert S. McNamara assumed that Ho Chi Minh would react to situations in the same manner as they would. Similarly, in the run-up to the Gulf War, there was a serious misapprehension, independent of politically motivated intelligence manipulation, that Saddam Hussein would view the situation in Kuwait as both the State Department and the White House did.
Opposing countries are not monolithic, even within their governments. There can be bureaucratic competition, which becomes associated with different ideas. Some dictators, such as Hitler and Stalin, were known for creating internal dissension, so that only the leader was in complete control. A current issue, which analysts understand but which politicians may not (or may want to exploit by playing on domestic fears), is the actual political and power structure of Iran; the power of Iran's president must not be equated with that of the president of the United States.
Opponents are not always rational. They may have a greater risk tolerance than one's own country. Maintaining the illusion of a WMD threat appears to have been one of Saddam Hussein's survival strategies. Returning to the Iranian example, an apparently irrational statement from Iranian President Mahmoud Ahmadinejad would not carry the weight of a similar statement by Supreme Leader Ali Khamenei. Analysts sometimes assume that the opponent is totally wise and knows all of the other side's weaknesses. Despite that danger, opponents are unlikely to act according to one's best-case scenario; they may take the worst-case approach to which one is most vulnerable.
Analysts should form hypotheses but also be prepared to reexamine them repeatedly in light of new information, instead of searching for evidence that buttresses a favored theory. They must remember that the enemy may be deliberately deceiving them with information that seems plausible to the enemy. Donald Bacon observed that "the most successful deception stories were apparently as reasonable as the truth. Allied strategic deception, as well as Soviet deception in support of the operations at Stalingrad, Kursk, and the 1944 summer offensive, all exploited German leadership's preexisting beliefs and were, therefore, incredibly effective." [7] Deception stories that Hitler thought implausible were not accepted. Western deception staffs alternated "ambiguous" and "misleading" deceptions; the former were intended simply to confuse analysts, and the latter to make one false alternative seem especially likely.
Among modern militaries, the Russians treat strategic deception (or, in their term, maskirovka, which goes beyond the English word to include deception, operational security, and concealment) as an integral part of all planning. The highest levels of command are involved. [8]
Bacon wrote further:
The battle of Kursk was also an example of effective Soviet maskirovka. While the Germans were preparing for their Kursk offensive, the Soviets created a story that they intended to conduct only defensive operations at Kursk. The reality was the Soviets planned a large counteroffensive at Kursk once they blunted the German attack.... German intelligence for the Russian Front assumed the Soviets would conduct only "local" attacks around Kursk to "gain" a better jumping off place for the winter offensive.
The counterattack by the Steppe Front stunned the Germans.
The opponent may try to overload one's analytical capabilities. [9] This can also serve as a gambit for those preparing the intelligence budget and for agencies whose fast track to promotion lies in collection: one's own side may produce so much raw data that the analyst is overwhelmed, even without enemy assistance.
The Strategic Defense Initiative (SDI), derisively nicknamed the "Star Wars program", was a proposed missile defense system intended to protect the United States from attack by ballistic strategic nuclear weapons. The concept was announced on March 23, 1983, by President Ronald Reagan, a vocal critic of the doctrine of mutually assured destruction (MAD), which he described as a "suicide pact". Reagan called upon American scientists and engineers to develop a system that would render nuclear weapons obsolete. Elements of the program reemerged in 2019 with the Space Development Agency (SDA).
Operation Anadyr was the code name used by the Soviet Union for its Cold War secret operation in 1962 of deploying ballistic missiles, medium-range bombers, and a division of mechanized infantry to Cuba to create an army group that would be able to prevent an invasion of the island by United States forces. The plan was to deploy approximately 60,000 personnel in support of the main missile force, which consisted of three R-12 missile regiments and two R-14 missile regiments. However, part of it was foiled when the United States discovered the plan, prompting the Cuban Missile Crisis.
Counterintelligence (counter-intelligence) or counterespionage (counter-espionage) is any activity aimed at protecting an agency's intelligence program from an opposition's intelligence service. It includes gathering information and conducting activities to prevent espionage, sabotage, assassinations or other intelligence activities conducted by, for, or on behalf of foreign powers, organizations or persons.
A false flag operation is an act committed with the intent of disguising the actual source of responsibility and pinning blame on another party. The term "false flag" originated in the 16th century as an expression meaning an intentional misrepresentation of someone's allegiance. The term was famously used to describe a ruse in naval warfare whereby a vessel flew the flag of a neutral or enemy country in order to hide its true identity. The tactic was originally used by pirates and privateers to deceive other ships into allowing them to move closer before attacking them. It later was deemed an acceptable practice during naval warfare according to international maritime laws, provided the attacking vessel displayed its true flag once an attack had begun.
Essence of Decision: Explaining the Cuban Missile Crisis is a book by political scientist Graham T. Allison analyzing the 1962 Cuban Missile Crisis. Allison used the crisis as a case study for future studies into governmental decision-making. The book became the founding study of the John F. Kennedy School of Government, and in doing so revolutionized the field of international relations.
Intelligence analysis is the application of individual and collective cognitive methods to weigh data and test hypotheses within a secret socio-cultural context. The descriptions are drawn from what may only be available in the form of deliberately deceptive information; the analyst must correlate the similarities among deceptions and extract a common truth. Although its practice is found in its purest form inside national intelligence agencies, its methods are also applicable in fields such as business intelligence or competitive intelligence.
The analysis of competing hypotheses (ACH) is a methodology for evaluating multiple competing hypotheses for observed data. It was developed by Richards (Dick) J. Heuer, Jr., a 45-year veteran of the Central Intelligence Agency, in the 1970s for use by the Agency. ACH is used by analysts in various fields who make judgments that entail a high risk of error in reasoning. ACH aims to help an analyst overcome, or at least minimize, some of the cognitive limitations that make prescient intelligence analysis so difficult to achieve.
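To illustrate the idea only (this sketch is not drawn from Heuer's text, and the hypotheses, evidence items, and ratings are invented for the example), an ACH-style matrix can be tallied by counting the evidence inconsistent with each hypothesis, since the method emphasizes disproving hypotheses rather than confirming a favorite:

```python
# Illustrative sketch of tallying an ACH-style consistency matrix.
# Hypotheses (H1-H3), evidence items, and ratings are hypothetical.
from typing import Dict

# Each evidence item is rated against each hypothesis:
# "C" = consistent, "I" = inconsistent, "N" = neutral / not applicable.
matrix: Dict[str, Dict[str, str]] = {
    "E1: troop movements observed":        {"H1": "C", "H2": "C", "H3": "I"},
    "E2: no logistics build-up detected":  {"H1": "I", "H2": "C", "H3": "C"},
    "E3: diplomatic signalling continues": {"H1": "I", "H2": "N", "H3": "C"},
}

def inconsistency_scores(matrix: Dict[str, Dict[str, str]]) -> Dict[str, int]:
    """Count inconsistent evidence per hypothesis; the hypothesis with the
    fewest inconsistencies is the one least disproved by the evidence."""
    scores: Dict[str, int] = {}
    for ratings in matrix.values():
        for hypothesis, rating in ratings.items():
            scores[hypothesis] = scores.get(hypothesis, 0) + (rating == "I")
    return scores

if __name__ == "__main__":
    for hypothesis, score in sorted(inconsistency_scores(matrix).items(),
                                    key=lambda kv: kv[1]):
        print(f"{hypothesis}: {score} inconsistent evidence item(s)")
```

In this toy example, H2 survives best because no evidence item contradicts it; the point of the exercise is that an analyst's attention is drawn to disconfirming evidence rather than to evidence that merely fits a preferred hypothesis.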
Russian military deception, sometimes known as maskirovka, is a military doctrine developed from the start of the 20th century. The doctrine covers a broad range of measures for military deception, from camouflage to denial and deception.
Intelligence collection management is the process of managing and organizing the collection of intelligence from various sources. The collection department of an intelligence organization may attempt basic validation of what it collects, but is not supposed to analyze its significance. There is debate in the U.S. intelligence community on the difference between validation and analysis; for example, the National Security Agency may try to interpret information when such interpretation is the job of another agency.
Intelligence cycle management refers to the overall activity of guiding the intelligence cycle, which is a set of processes used to provide decision-useful information (intelligence) to leaders. The cycle consists of several processes, including planning and direction, collection, processing and exploitation, analysis and production, and dissemination and integration. The related field of counterintelligence is tasked with impeding the intelligence efforts of others. Intelligence organizations are not infallible but, when properly managed and tasked, can be among the most valuable tools of management and government.
Intelligence Analysis Management is the process of managing and organizing the analytical processing of raw intelligence information. The terms "analysis", "production", and "processing" denote the organization and evaluation of raw information used in a phase informally called "connecting the dots", thus creating an "intelligence mosaic". The information may result in multiple analytic products, each with different security classifications, time scales, and levels of detail. Intelligence analysis goes back to the beginning of history. Sherman Kent is often considered the father of modern intelligence analysis. His writings include a 1947 book, Strategic Intelligence for American World Policy.
Intelligence dissemination management is a maxim of intelligence arguing that intelligence agencies advise policymakers instead of shaping policy. Due to the necessity of quick decision-making in periods of crisis, intelligence analysts may suggest possible actions, including a prediction of the consequences of each decision. Intelligence consumers and providers still struggle with the balance of what drives information flow. Dissemination is the part of the intelligence cycle that delivers products to consumers, and intelligence dissemination management refers to the process that encompasses organizing the dissemination of the finished intelligence.
National intelligence programs, and, by extension, the overall defenses of nations, are vulnerable to attack. It is the role of intelligence cycle security to protect the process embodied in the intelligence cycle, and that which it defends. A number of disciplines go into protecting the intelligence cycle. One of the challenges is that there is a wide range of potential threats, so threat assessment, if complete, is a complex task. Governments try to protect three things:
Clandestine HUMINT covers the functions within that discipline, including espionage and active counterintelligence.
After the Central Intelligence Agency lost its role as the coordinator of the entire Intelligence Community (IC), special coordinating structures were created by each president to fit his administrative style and the perceived level of threat from terrorists during his term.
This article deals with activities of the U.S. Central Intelligence Agency related to arms control, weapons of mass destruction (WMD), and weapons proliferation. It examines the process of tasking and analysis rather than the problem itself, other than whether the CIA's efforts match its legal mandate or assist in treaty compliance. In some cases, the details of a country's programs are introduced because they present a problem in analysis. For example, if Country X's policymakers truly believe a version of history that may not actually be factual, an analyst trying to understand Country X's policymakers needs to be able to understand their approach to an issue.
Words of estimative probability (WEP) are terms used by intelligence analysts in the production of analytic reports to convey the likelihood of a future event occurring. A well-chosen WEP gives a decision maker a clear and unambiguous estimate upon which to base a decision. Ineffective WEPs are vague or misleading about the likelihood of an event; they place the decision maker in the role of the analyst, increasing the likelihood of poor or snap decision making. Some intelligence and policy failures appear to be related to the imprecise use of estimative words.
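As a rough illustration only, the sketch below maps estimative phrases to numeric probability ranges in the spirit of Sherman Kent's estimative chart; the specific phrases and ranges are assumptions chosen for the example, not an official standard:

```python
# Illustrative sketch: mapping estimative words to numeric ranges.
# The phrases and ranges below are assumptions for the example,
# loosely in the spirit of Kent's chart; they are not an official standard.

WEP_RANGES = {
    "almost certain":       (0.90, 0.99),
    "probable":             (0.60, 0.85),
    "chances about even":   (0.45, 0.55),
    "probably not":         (0.15, 0.40),
    "almost certainly not": (0.01, 0.10),
}

def describe(probability: float) -> str:
    """Return the estimative phrase whose range contains the probability."""
    for phrase, (low, high) in WEP_RANGES.items():
        if low <= probability <= high:
            return phrase
    return "no agreed phrase (falls between defined ranges)"

if __name__ == "__main__":
    print(describe(0.75))  # -> "probable"
    print(describe(0.05))  # -> "almost certainly not"
```

The gaps between ranges are deliberate in this sketch: they mark estimates that an analyst would need to express explicitly rather than hide behind a vague phrase.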
Failure in the intelligence cycle or intelligence failure, is the outcome of the inadequacies within the intelligence cycle. The intelligence cycle itself consists of six steps that are constantly in motion. The six steps are: requirements, collection, processing and exploitation, analysis and production, dissemination and consumption, and feedback.
The target-centric approach to intelligence is a method of intelligence analysis that Robert M. Clark introduced in his book "Intelligence Analysis: A Target-Centric Approach" in 2003 to offer an alternative methodology to the traditional intelligence cycle. Its goal is to redefine the intelligence process in such a way that all of the parts of the intelligence cycle come together as a network. It is a collaborative process where collectors, analysts and customers are integral, and information does not always flow linearly.
Deception technology is a category of cyber security defense mechanisms that provide early warning of potential cyber security attacks and alert organizations of unauthorized activity. Deception technology products can detect, analyze, and defend against zero-day and advanced attacks, often in real time. They are automated, accurate, and provide insight into malicious activity within internal networks which may be unseen by other types of cyber defense. Deception technology enables a more proactive security posture by seeking to deceive an attacker, detect them and then defeat them.