# Analysis - Part 5

Ed note: This is Part 5 of a paper written in June 1999 by Lisa Krizan of the Joint Military Intelligence College. It is reprinted here by permission. Parts 1, 2, 3 and 4 were published as complete documents, as chapters in a book (thus the odd footnote and figure numbering), and also as a general overview of the intelligence process on the Society of Competitive Intelligence Professionals (SCIP) website. This paper discusses analyzing collected data, and is part of our series on the "Secret Amateur Spy."
Analysis is the breaking down of a large problem into a number of smaller problems and performing mental operations on the data in order to arrive at a conclusion or a generalization. It involves close examination of related items of information to determine the extent to which they conﬁrm, supplement, or contradict each other and thus to establish probabilities and relationships. - Mathams, 88.
Analysis is not merely reorganizing data and information into a new format. At the very least, analysis should fully describe the phenomenon under study, accounting for as many relevant variables as possible. At the next higher level of analysis, a thorough explanation of the phenomenon is obtained, through interpreting the signiﬁcance and effects of its elements on the whole. Ideally, analysis can reach successfully beyond the descriptive and explanatory levels to synthesis and effective persuasion, often referred to as estimation.

The purpose of intelligence analysis is to reveal to a speciﬁc decisionmaker the underlying signiﬁcance of selected target information. Frequently intelligence analysis involves estimating the likelihood of one possible outcome, given the many possibilities in a particular scenario. This function is not to be confused with prediction, as no one can honestly be credited with predicting the future. However, intelligence analysis does appropriately involve forecasting, "which requires the explicit statement by the analyst of the degree of conﬁdence held in a certain set of judgments, based upon a certain set of explicit facts or assumptions."45 Different levels of analysis result in corresponding levels of conclusions that may be traced along an "Intelligence Food Chain."46 This concept, illustrated in the following table, is equally applicable in government and business intelligence.

Table 9: The Intelligence Food Chain
Source: adapted from Davis, Analytic Tradecraft

- Facts - verified information related to an intelligence issue (for example: events, measured characteristics).
- Findings - expert knowledge based on organized information that indicates, for example, what is increasing, decreasing, changing, taking on a pattern.
- Forecasts - judgments based on facts and findings and defended by sound and clear argumentation.
- Fortunetelling - inadequately explained and defended judgments.

Intelligence analysts may use this Food Chain model to measure their adherence to rigorous analytic thought - how far to go with their analytic judgments, and where to draw the line. The mnemonic "Four Fs Minus One" may serve as a reminder of how to apply this criterion. Whenever the intelligence information allows, and the customer's validated needs demand it, the intelligence analyst will extend the thought process as far along the Food Chain as possible, to the third "F" but not beyond to the fourth.

Types of Reasoning
Objectivity is the intelligence analyst's primary asset in creating intelligence that meets the Four Fs Minus One criterion. More than simply a conscientious attitude, objectivity is "a professional ethic that celebrates tough-mindedness and clarity in applying rules of evidence, inference, and judgment."47 To produce intelligence objectively, the analyst must employ a process tailored to the nature of the problem. Four basic types of reasoning apply to intelligence analysis: induction, deduction, abduction and the scientiﬁc method.

Induction. The induction process is one of discovering relationships among the phenomena under study. For example, an analyst might discover from systematic examination of media reports that Country "X" had been issuing aggressive statements prior to formally announcing an arms agreement with Country "Y." Or an analyst may notice that a characteristic sequence of events always precedes Country "Z's" nuclear weapons tests.48 In the words of Clauser and Weir:
Induction is the intellectual process of drawing generalizations on the basis of observations or other evidence. Induction takes place when one learns from experience. For example, induction is the process by which a per-son learns to associate the color red with heat and heat with pain, and to generalize these associations to new situations.
Induction occurs when one is able to postulate causal relationships. Intelligence estimates are largely the result of inductive processes, and, of course, induction takes place in the formulation of every hypothesis. Unlike other types of intellectual activities such as deductive logic and mathematics, there are no established rules for induction.49
Deduction. "Deduction is the process of reasoning from general rules to particular cases. Deduction may also involve drawing out or analyzing premises to form a conclusion."50 In the case of Country "Z" above, the analyst noted a pattern of events related to testing of nuclear weapons. Later, after noticing this series of events occurring in Country "Z," the analyst may conclude that another nuclear weapons test is about to take place in that country. The first premise, that certain events were related to weapons testing, was derived inductively - from specific observations to a conclusion. The second premise, that another test was imminent, was derived deductively - from a generalization to a specific case.51

Deduction works best in closed systems such as mathematics, formal logic, or certain kinds of games in which all the rules are clearly spelled out. For example, the validity and truthfulness of the following conclusion is apparent to anyone with a knowledge of geometry: "This is a triangle, therefore the sum of the interior angles will equal 180 degrees." In closed systems, properly drawn deductive conclusions are always valid.52

However, intelligence analysis rarely deals with closed systems, so premises assumed to be true may in fact be false, and lead to false conclusions. For example, in the weapons testing case above, Country "Z" may have deliberately deceived potential observers by falsely staging activities similar to those usually taken before a real weapons test. A conclusion that observed activities signalled a real test would be false in this case. Thus, as human activities rarely involve closed systems, deduction must be used carefully in intelligence analysis.53
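The Country "Z" example can be sketched as a two-step process: an inductively derived generalization, then a deductive check of a new case. The sketch below is a minimal illustration only; the precursor events and variable names are hypothetical, not drawn from the paper's sources.

```python
# Induction: generalize from repeated observations made before past tests.
# Each set lists events observed in one pre-test period (hypothetical data).
past_test_precursors = [
    {"site_activity", "telemetry_vans", "airspace_closure"},
    {"site_activity", "telemetry_vans", "airspace_closure", "troop_rotation"},
    {"site_activity", "telemetry_vans", "airspace_closure"},
]

# The inferred general rule: events common to every observed pre-test period.
pattern = set.intersection(*past_test_precursors)

# Deduction: apply the general rule to a particular new case.
current_observations = {"site_activity", "telemetry_vans", "airspace_closure"}
test_imminent = pattern <= current_observations  # all pattern events observed

# Caveat from the text: the premise may be false. Deliberately staged
# activities would satisfy the pattern without a real test following,
# so this deductive conclusion is only as sound as the inductive premise.
```

The subset test is the deductive step: given the rule "these events precede a test," the particular case follows mechanically, which is why the quality of the conclusion hinges entirely on the inductively derived premise.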

Readers interested in further study into the use of deductive logic in estimative intelligence may wish to read the work of Israeli intelligence analyst Isaac Ben-Israel on this subject.54 At the Joint Military Intelligence College, one student, Navy Lieutenant Donald Carney, explored the application of deductive logic to intelligence collection and analysis decisions in estimating the disintegration of Yugoslavia. Carney showed that Ben-Israel's "critical method" of inquiry could be applied prospectively to the collection of information to refute speciﬁc hypotheses, allowing for an unusually deﬁnitive estimate of the likelihood of each outcome.55

Abduction. Abduction is the process of generating a novel hypothesis to explain given evidence that does not readily suggest a familiar explanation. This process differs from induction in that it adds to the set of hypotheses available to the analyst. In inductive reasoning, the hypothesized relationship among pieces of evidence is considered to be already existing, needing only to be perceived and articulated by the analyst. In abduction, the analyst creatively generates an hypothesis, then sets about examining whether the available evidence unequivocally leads to the new conclusion. The latter step, testing the evidence, is a deductive inference.56

Abductive reasoning may also be called intuition, inspiration, or the "Ah-ha!" experience. It characterizes the analyst's occasional ability to come to a conclusion spontaneously, often without a sense of having consciously taken deﬁnable steps to get there. While the abduction process may not be easily deﬁned or taught, it may be encouraged by providing analysts with a wide array of research material and experiences, and by sup-porting the expenditure of time and energy on creative thinking.57

Examples of abductive reasoning in intelligence analysis include situations in which the analyst has a nagging suspicion that something of intelligence value has happened or is about to happen, but has no immediate explanation for this conclusion. The government intelligence analyst may conclude that an obscure rebel faction in a target country is about to stage a political coup, although no overt preparations for the takeover are evident. The business analyst may determine that a competitor company is on the brink of a dramatic shift from its traditional product line into a new market, even though its balance sheet and status in the industry are secure. In each case, the analyst, trusting this sense that the time is right for a signiﬁcant event, will set out to gather and evaluate evidence in light of the new, improbable, yet tantalizing hypothesis.

Scientiﬁc Method. The scientiﬁc method combines deductive and inductive reasoning: Induction is used to develop the hypothesis, and deduction is used to test it. In science, the analyst obtains data through direct observation of the subject and formulates an hypothesis to explain conclusions suggested by the evidence. Experiments on the subject are devised and conducted to test the validity of the hypothesis. If the experimental results match the expected outcome, then the hypothesis is validated; if not, then the analyst must develop a new hypothesis and appropriate experimental methods.58

In intelligence analysis, the analyst typically does not have direct access to the observable subject, but gathers information indirectly. From these gathered data, the intelligence analyst may proceed with the scientiﬁc method by generating tentative explanations for a subject event or phenomenon. Next, each hypothesis is examined for plausibility and compared against newly acquired information, in a continual process toward reaching a conclusion. Often the intelligence analyst tests several hypotheses at the same time, whereas the scientist usually focuses on one at a time. Furthermore, intelligence analysts cannot usually experiment directly upon the subject matter as in science, but must generate ﬁctional scenarios and rigorously test them through mental processes such as those suggested below.59
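The parallel testing of several hypotheses described above can be sketched as a running tally of how well each candidate explanation fits newly acquired information. The hypotheses, evidence items, and weights below are invented for illustration; real analytic judgments would not reduce this cleanly to integers.

```python
# Three competing hypotheses examined in parallel (hypothetical labels).
hypotheses = {
    "H1: routine exercise": 0,
    "H2: deception operation": 0,
    "H3: genuine test preparation": 0,
}

# Each new item of information supports (+1), contradicts (-1), or is
# neutral (0) toward each hypothesis; the values here are illustrative.
evidence = [
    {"H1: routine exercise": -1, "H2: deception operation": 1,
     "H3: genuine test preparation": 1},
    {"H1: routine exercise": 0, "H2: deception operation": -1,
     "H3: genuine test preparation": 1},
]

# Update every hypothesis as each item arrives, rather than testing
# one hypothesis at a time as a laboratory scientist might.
for item in evidence:
    for h, weight in item.items():
        hypotheses[h] += weight

best = max(hypotheses, key=hypotheses.get)
```

The point of the sketch is structural: all hypotheses are scored against the same evidence stream, so a conclusion emerges from comparison rather than from confirming a single favored explanation.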

Methods of Analysis
Opportunity Analysis. Opportunity analysis identiﬁes for policy ofﬁcials opportunities or vulnerabilities that the customer's organization can exploit to advance a policy, as well as dangers that could undermine a policy.60 It identiﬁes institutions, interest groups, and key leaders in a target country or organization that support the intelligence customer's objective; the means of enhancing supportive elements; challenges to positive elements (which could be diminished or eliminated); logistic, ﬁnancial, and other vulnerabilities of adversaries; and activities that could be employed to rally resources and support to the objective.61 Jack Davis notes that in the conduct of opportunity analysis,
[T]he analyst should start with the assumption that every policy concern can be transformed into a legitimate intelligence concern. What follows from this is that analysts and their managers should learn to think like a policy-maker in order to identify the issues on which they can provide utility, but they should always [behave like intelligence producers]. ... The ﬁrst step in producing effective opportunity analysis is to redeﬁne an intelligence issue in the policymaker's terms. This requires close attention to the policymaker's role as "action ofﬁcer" - reﬂecting a preoccupation with getting things started or stopped among adversaries and allies.... It also requires that analysts recognize a policy ofﬁcial's propensity to take risk for gain....[P]olicymakers often see, say, a one-in-ﬁve chance of turning a situation around as a sound investment of [organizational] prestige and their professional energies....[A]nalysts have to search for appropriate ways to help the policymaker inch the odds upward - not by distorting their bottom line when required to make a predictive judgment, or by cheerleading, but by pointing to opportunities as well as obstacles. Indeed, on politically sensitive issues, analysts would be well advised to utilize a matrix that ﬁrst lists and then assesses both the promising and discouraging signs they, as objective observers, see for... policy goals.... [P]roperly executed opportunity analysis stresses information and possibilities rather than [explicit] predictions.62
Linchpin Analysis. Linchpin analysis is one way of showing intelligence managers and policy ofﬁcials alike that all the bases have been touched. Linchpin analysis, a colorful term for structured forecasting, is an anchoring tool that seeks to reduce the hazard of self-inﬂicted intelligence error as well as policymaker misinterpretation. At a minimum, linchpin tradecraft promotes rigor through a series of predrafting checkpoints, outlined below. Analysts can also use it to organize and evaluate their text when addressing issues of high uncertainty. Reviewing managers can use - and have used - linchpin standards to ensure that the argument in such assessments is sound and clear.63

Table 10: Steps in Linchpin Analysis
Source: Davis, Analytic Tradecraft

1. Identify the main uncertain factors or key variables judged likely to drive the outcome of the issue, forcing systematic attention to the range of and relationships among factors at play.
2. Determine the linchpin premises or working assumptions about the drivers. This encourages testing of the key subordinate judgments that hold the estimative conclusion together.
3. Marshal findings and reasoning in defense of the linchpins, as the premises that warrant the conclusion are subject to debate as well as error.
4. Address the circumstances under which unexpected developments could occur. What indicators or patterns of development could emerge to signal that the linchpins were unreliable? And what triggers or dramatic internal and external events could reverse the expected momentum?
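The four checkpoints above can be sketched as a simple pre-drafting checklist. The assessment contents, field names, and helper function below are invented for illustration; they are one possible way to make the checkpoints explicit, not a method prescribed by Davis.

```python
# Hypothetical draft assessment, one field per linchpin checkpoint.
assessment = {
    "key_drivers": ["succession struggle", "economic sanctions"],    # step 1
    "linchpin_premises": ["military stays loyal to the regime"],     # step 2
    "supporting_findings": ["no defections reported in 12 months"],  # step 3
    "unreliability_indicators": ["senior officer defections",        # step 4
                                 "garrison redeployments to capital"],
}

def linchpins_complete(a):
    """Return the checkpoints left unaddressed before drafting."""
    required = ["key_drivers", "linchpin_premises",
                "supporting_findings", "unreliability_indicators"]
    return [step for step in required if not a.get(step)]

missing = linchpins_complete(assessment)  # empty when every checkpoint is met
```

A reviewing manager applying linchpin standards is, in effect, running this check by hand: an assessment with an empty `unreliability_indicators` list has skipped step 4 and offers the reader no warning signs to watch for.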

Analogy. Analogies depend on the real or presumed similarities between two things. For example, analysts might reason that because two aircraft have many features in common, they may have been designed to perform similar missions. The strength of any such analogy depends upon the strength of the connection between a given condition and a speciﬁed result. In addition, the analyst must consider the characteristics that are dissimilar between the phenomena under study. The dissimilarities may be so great that they render the few similarities irrelevant.
One of the most widely used tools in intelligence analysis is the analogy. Analogies serve as the basis for most hypotheses, and rightly or wrongly, underlie many generalizations about what the other side will do and how they will go about doing it.64
Thus, drawing well-considered generalizations is the key to using analogy effectively. When postulating human behavior, the analyst may effectively use analogy by applying it to a speciﬁc person acting in a situation similar to one in which his actions are well documented: an election campaign or a treaty negotiation, for example. However, an assumption that a different individual running for the same ofﬁce or negotiating a similar treaty would behave the same way as his predecessor may be erroneous. The key condition in this analogy is the personality of the individual, not the similar situations. This principle of appropriate comparison applies equally to government and business intelligence analysis.
Analogies are used in many different kinds of intelligence analyses from military and political to industrial intelligence. For example, major U.S. auto makers purchase their competitors' models as soon as they appear in the showrooms. The new cars are taken to laboratories where they are completely and methodically disassembled. Reasoning by analogy, that is, assuming that it would cost one producer the same amount to produce or purchase the same components used by another, the major auto producers can estimate their competitors' per-unit production costs, any cost-saving measures taken, and how much proﬁt is likely to be earned by the sale of a single unit.65
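The auto-industry example reduces to simple arithmetic once the analogy is granted. All component costs, the overhead figure, and the sale price below are invented numbers; the point is only the structure of the inference, which assumes the competitor pays roughly what one's own firm would pay for the same parts and assembly.

```python
# Hypothetical teardown costing of a competitor's vehicle.
component_costs = {
    "engine": 2400.0,
    "transmission": 900.0,
    "body_and_chassis": 1800.0,
    "interior": 700.0,
    "electronics": 500.0,
}

# The analogy's key premise: our assembly and overhead costs approximate theirs.
assembly_and_overhead = 1200.0

estimated_unit_cost = sum(component_costs.values()) + assembly_and_overhead
sale_price = 9500.0
estimated_profit_per_unit = sale_price - estimated_unit_cost
```

The estimate fails exactly where the analogy fails: if the competitor has supplier contracts or manufacturing processes unlike one's own, the "same costs" premise breaks and the per-unit figures mislead.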
Customer Focus
As with the previous stages of the intelligence process, effective analysis depends upon a good working relationship between the intelligence customer and producer. A signiﬁcant difference exists between the public and private sectors with regard to this customer-producer relationship. Government analysts typically beneﬁt from close interaction with policymakers by virtue of their well understood institutional position. The same is not often true in the business world, where the intelligence analyst's role is not yet well institutionalized.

The government intelligence analyst is generally considered a legitimate and necessary policymaking resource, and even fairly junior employees may be accepted as national experts by virtue of the knowledge and analytic talent they offer to high level customers. Conversely, in the private sector, the intelligence analyst's corporate rank is generally far below that of a company vice-president or CEO. The individual analyst may have little access to the ultimate customer, and the intelligence service as a whole may receive little favor from a senior echelon that makes little distinction between so-called intelligence and the myriad of other decisionmaking inputs. When private sector practitioners apply validated methods of analysis geared to meet specific customer needs, they can win the same kind of customer appreciation and support as that enjoyed by government practitioners.

Statistical Tools
Additional decisionmaking tools derived from parametric or non-parametric statistical techniques, such as Bayesian analysis, are sometimes used in intelligence. An exploration of them is beyond the scope of this study. Many of the statistically oriented tools continue to rely fundamentally on human judgment to assign values to variables, so close attention to the types of reasoning and methods of analysis presented herein remains the fundamental analytical precondition to their use.66
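As one illustration of the point that such tools still rest on human judgment, a minimal Bayesian update can be sketched as follows. The hypotheses, prior probabilities, and likelihoods are all invented analyst judgments, not values from the text: the machinery is mechanical, but every input number is an act of analysis.

```python
# Analyst-assigned prior belief over two hypotheses (hypothetical values).
prior = {"test_planned": 0.2, "no_test": 0.8}

# Analyst-judged likelihood of observing "airspace closure" under each
# hypothesis - again a human judgment, not a measured quantity.
likelihood = {"test_planned": 0.9, "no_test": 0.3}

# Bayes' rule: posterior is proportional to likelihood times prior,
# normalized so the posterior probabilities sum to one.
unnormalized = {h: likelihood[h] * prior[h] for h in prior}
total = sum(unnormalized.values())
posterior = {h: p / total for h, p in unnormalized.items()}
```

Even in this toy case, the posterior shift toward "test_planned" is driven entirely by the likelihood ratio the analyst chose, which is why the reasoning disciplines discussed above remain prerequisite to the statistics.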

Analytic Mindset
Customer needs and collected information and data are not the only factors that inﬂuence the analytic process; the analyst brings his or her own unique thought patterns as well. This personal approach to problem-solving is "the distillation of the intelligence analyst's cumulative factual and conceptual knowledge into a framework for making estimative judgments on a complex subject."67 Mindset helps intelligence analysts to put a situation into context, providing a frame of reference for examining the subject. Analysis could not take place if thinking were not bounded by such constructs. However, mindset can also lead analysts to apply certain viewpoints inappropriately or exclusively while neglecting other potentially enlightening perspectives on an issue. While no one can truly step outside his or her own mindset, becoming aware of potential analytic pitfalls can enable intelligence analysts to maximize the positive effects of mindset while minimizing the negatives.68 Analysts can use the accompanying list of analytical pitfalls to determine which, if any, they may be applying in their work, and whether the relevant ones are accounted for in their analytic tasks.

Categories of Misperception and Bias69

Evoked-Set Reasoning:
That information and concern which dominates one's thinking based on prior experience. One tends to uncritically relate new information to past or current dominant concerns.

Prematurely Formed Views: These spring from a desire for simplicity and stability, and lead to premature closure in the consideration of a problem.

Presumption that Support for One Hypothesis Disconﬁrms Others:
Evidence that is consistent with one's preexisting beliefs is allowed to disconﬁrm other views. Rapid closure in the consideration of an issue is a problem.

Inappropriate Analogies:
Perception that an event is analogous to past events, based on inadequate consideration of concepts or facts, or irrelevant criteria. Bias of "Representativeness."

Superﬁcial Lessons From History:
Uncritical analysis of concepts or events, superﬁcial causality, over-generalization of obvious factors, inappropriate extrapolation from past success or failure.

Presumption of Unitary Action by Organizations:
Perception that behavior of others is more planned, centralized, and coordinated than it really is. Dismisses accident and chaos. Ignores misperceptions of others. Fundamental attribution error, possibly caused by cultural bias.

Organizational Parochialism: Selective focus or rigid adherence to prior judgments based on organizational norms or loyalties. Can result from functional specialization. Group-think or stereotypical thinking.

Excessive Secrecy (Compartmentation): Over-narrow reliance on selected evidence. Based on concern for operational security. Narrows consideration of alternative views. Can result from or cause organizational parochialism.

Ethnocentrism:
Projection of one's own culture, ideological beliefs, doctrine, or expectations on others. Exaggeration of the causal signiﬁcance of one's own action. Can lead to mirror-imaging and wishful thinking. Parochialism.

Lack of Empathy: Undeveloped capacity to understand others' perception of their world, their conception of their role in that world, and their deﬁnition of their interests. Difference in cognitive contexts.

Mirror-Imaging: Perceiving others as one perceives oneself. Basis is ethnocentrism. Facilitated by closed systems and parochialism.

Ignorance: Lack of knowledge. Can result from prior-limited priorities or lack of curiosity, perhaps based on ethnocentrism, parochialism, denial of reality, rational-actor hypothesis (see next entry).

Rational-Actor Hypothesis: Assumption that others will act in a "rational" manner, based on one's own rational reference. Results from ethnocentrism, mirror-imaging, or ignorance.

Denial of Rationality: Attribution of irrationality to others who are perceived to act outside the bounds of one's own standards of behavior or decisionmaking. Opposite of rational-actor hypothesis. Can result from ignorance, mirror-imaging, parochialism, or ethnocentrism.

Proportionality Bias:
Expectation that the adversary will expend efforts proportionate to the ends he seeks. Inference about the intentions of others from costs and consequences of actions they initiate.

Willful Disregard of New Evidence:
Rejection of information that conﬂicts with already-held beliefs. Results from prior policy commitments, and/or excessive pursuit of consistency.

Image and Self-Image:
Perception of what has been, is, will be, or should be (image as subset of belief system). Both inward-directed (self-image) and outward-directed (image). Both often inﬂuenced by self-absorption and ethnocentrism.

Defensive Avoidance:
Refusal to perceive and understand extremely threatening stimuli. Need to avoid painful choices. Leads to wishful thinking.

Overconﬁdence in Subjective Estimates: Optimistic bias in assessment. Can result from premature or rapid closure of consideration, or ignorance.

Wishful Thinking (Pollyanna Complex):
Hyper-credulity. Excessive optimism born of smugness and overconﬁdence.

Best-Case Analysis:
Optimistic assessment based on cognitive predisposition and general beliefs of how others are likely to behave, or in support of personal or organizational interests or policy preferences.

Conservatism in Probability Estimation:
In a desire to avoid risk, tendency to avoid estimating extremely high or extremely low probabilities. Routine thinking. Inclination to judge new phenomena in light of past experience, to miss essentially novel situational elements, or failure to reexamine established tenets. Tendency to seek conﬁrmation of prior-held beliefs.

Worst-Case Analysis (Cassandra Complex): Excessive skepticism. Reﬂects pessimism and extreme caution, based on predilection (cognitive predisposition), adverse past experience, or on support of personal or organizational interests or policy preferences.

Because the biases and misperceptions outlined above can influence analysis, they may also affect the resultant analytic products. As explained in the following Part, analysis does not cease when intelligence production begins; indeed, the two are interdependent. The foregoing overview of analytic pitfalls should caution intelligence managers and analysts that intelligence products should remain as free as possible from such errors of omission and commission, yet still be tailored to the specific needs of customers. Consistently reminding intelligence producers of the dangers and benefits of mindset may help them avoid errors and polish their analytic skills. In addition, managers may conduct post-production evaluation of intelligence products, using the biases and misperceptions listed above to identify strengths and weaknesses in individual analysts' work, and to counsel them accordingly.

45 Dearth, "National Intelligence," 25.
46 Adapted from Jack Davis, Intelligence Changes in Analytic Tradecraft in CIA's Directorate of Intelligence, (Washington, DC: CIA Directorate of Intelligence, April 1995), 6.
48 Clauser and Weir, 81.
49 Clauser and Weir, 81.
50 Clauser and Weir, 81.
51 Clauser and Weir, 82-83.
52 Clauser and Weir, 83.
53 Clauser and Weir, 83-84.
54 Isaac Ben-Israel, "Philosophy and Methodology of Intelligence: The Logic of Estimative Process," Intelligence and National Security 4, no. 4 (October 1989): 660-718.
55 LT Donald J. Carney, USN, Estimating the Dissolution of Yugoslavia, Seminar Paper (Washington, DC: Joint Military Intelligence College, September 1991).
56 David A. Schum, Evidence and Inference for the Intelligence Analyst, Volume I (Lanham, MD: University Press of America, 1987): 20.
57 The relationship of this type of reasoning to Eastern philosophy is addressed in LCDR William G. Schmidlin, USN, Zen and the Art of Intelligence Analysis, MSSI Thesis (Washington, DC: Joint Military Intelligence College, July 1993).
58 Mathams, 91. A seminal contribution to understanding scientiﬁc method is Abraham Kaplan's The Conduct of Inquiry (San Francisco, CA: Chandler, 1964). The applicability of this method in social science, and therefore, in intelligence, is developed in Earl Babbie's The Practice of Social Research (Belmont, CA: Wadsworth Publishing Co, 1992).
59 Mathams, 91.

60 Jack Davis, The Challenge of Opportunity Analysis (Washington, DC: Center for the Study of Intelligence, July 1992), v.
61 Davis, Opportunity Analysis, 7.
62 Davis, Opportunity Analysis, 12-13.
64 Clauser and Weir, 246-248.
65 Clauser and Weir, 248-250.
66 Editor's note: A former JMIC faculty member, Douglas E. Hunter, explores the intelligence applications of Bayesian Analysis in Political/Military Applications of Bayesian Analysis: Methodological Issues (Boulder, CO: Westview, 1984).
67 Jack Davis, "Combatting Mindset," Studies in Intelligence 35, no. 4 (Winter 1991): 13-18.
68 Davis, "Combatting Mindset," 13-15.
69 Excerpted from Dearth, "The Politics of Intelligence," 106-107.

Published Thursday, November 16th, 2006