when committing analytic resources to draft major assessments; each tailors the specific requirements to its unique organizational needs. This checklist was first published in Heuer and Pherson, Structured Analytic Techniques for Intelligence Analysis, 47-48, and was slightly revised for this publication. For information about other Decomposition and Visualization techniques analysts can use to instill more rigor into their analysis, see Heuer and Pherson, Structured Analytic Techniques, 43-96.

CHAPTER 5: What Is My Analytic Approach?

SETTING THE STAGE

To turn your conceptual framework into a product, you will need to consider and select from a variety of analytic approaches or strategies. All of these are based on an underlying discipline of logic, which is the meat of analytic argumentation. By explicitly defining an approach, good critical thinkers create a consistent baseline from which they can track their work and show their progress to their boss or colleagues. Having an established plan also makes it easier to adjust the scope of the paper or reset timelines to meet changing needs and deadlines. If you know where you are in the analytic process and can communicate your plan of action, you build credibility and confidence among your peers as well as with your supervisors. A plan or written strategy is a tether to keep you on point and on pace while you explore new sources of data, implications, and possibilities.

Much of the literature on critical thinking processes and models focuses on the logic and argumentation thinkers use to make their points. But successful analysis is part of the larger process of inquiry, research, reasoning, and communication that we cover in this book. Good critical thinkers consider the broader picture. They seek the common elements among all the good advice they have received, apply them to specific frames, and tailor them to a strategy that works for the author, the client, and the context of the problem.

Your strategy and how much time you allocate to each step will depend on the type of analysis you are producing. Many of us are expected to produce shorter-term, tactical, or current intelligence products on a regular basis; these need to meet a threshold for interest and novelty. Periodically, analysts are asked to take operational or strategic views that explore rationales and implications or to take forward-looking views. For the longer-term projects, concept papers serve as a contract between the authors and their supervisors to ensure that the final product meets everyone's expectations.

"If you don't know where you're going, you might not get there."
—Yogi Berra, When You Come to a Fork in the Road, Take It! Inspiration and Wisdom From One of Baseball's Greatest Heroes1

LOOKING MORE DEEPLY

In all analytic endeavors, a good plan begins with defining the client, purpose, and question and then continues with collecting, organizing, and evaluating the information needed to answer the question. One of our colleagues thinks of this process in anthropological terms.2 Primitive man first gathered food and materials he knew existed, then hunted for what else he needed, and finally grew or raised food he could not obtain in other ways.
Applied to analysis, this translates as follows:

• Gathering—surveying current knowledge relating to the question
• Hunting—figuring out what is needed to answer the question
• Farming—creating knowledge relevant to answering the question

Whether you are writing on gangs, international money laundering, medical care, or nuclear terrorism, success in defining strategies to produce analytic products relies on having a defined context (see Chapter 3). The context or baseline includes both knowledge of what has already been written or collected on the topic to avoid "recreating the wheel" and a plan of action that helps you consciously and systematically with:

• Sorting the wheat from the chaff by assessing information needs, identifying gaps, and weeding out irrelevant, incorrect, or misleading information
• Setting the parameters for knowledge you need to acquire to formulate good hypotheses and make judgments

The intellectual journey from formulating the question to presenting the conclusion can be circuitous depending on:

• The clarity of the question to be answered or product generated
• The complexity of the problem or issue
• The ambiguity and availability of the information needed
• The amount of time available to the analyst

Analysts benefit from understanding the cognitive steps needed to amalgamate discrete bits of information into knowledge on which judgments and understanding can be based. The way our brains handle this task is often more intuitive and integrated than linear and scientific. Constantly incorporating critical thinking strategies into the analytic process is fundamental to producing insightful analysis. For this reason, it is a core theme of this book; Figure 5.1 lists eight critical thinking steps and the chapters in this book that address them.

FIGURE 5.1 Critical Thinking Strategies: The Eight Steps

Critical Thinking Steps                                           Associated Chapters in This Book
1. Ask the right questions.                                       2
2. Identify one's assumptions.                                    11
3. Reach out to other sources.                                    8
4. Evaluate the data for accuracy, relevance, and completeness.   9, 10
5. Assess the data and form hypotheses.                           12, 13
6. Evaluate the hypotheses; look for conflicting data.            15
7. Draw conclusions.                                              16
8. Present your findings.                                         19

VARIATIONS ON A THEME

These processes of planning, collecting information, assessing, communicating, and reviewing are central to the completion of any task. Organizations and authors portray the intelligence cycle in many different ways (see Figure 5.2). The components—whether depicted in six or seven steps or as a cycle or process—are similar to most problem-solving methods. Some now pointedly refer to the intelligence cycle as the intelligence process to minimize any implication that the steps occur sequentially or exclusively. Different terms are used to fit particular circumstances or environments, such as the OODA (observe, orient, decide, act) decision-making process designed by former US Air Force Colonel John Boyd for military actions and often cited as a model for security risk management.3

Scientific problem-solving, business problem-solving, and mathematical problem-solving methods parallel the critical thinking approach but employ different sets of data and timing. The steps may have slightly different names or be combined in slightly different ways. Figure 5.2 shows the similarities by giving similar actions the same degree of shading. The devil, of course, is in the details of the intellectual power brought to the process; it is the "Wow" and not just the "How."
Analytic strategies should reflect the process appropriate for your organization or discipline, but analysts should focus on how best to employ the thinking skills to create products that make a difference and enable the reader to take action.

[FIGURE 5.2 Graphic comparing depictions of the intelligence cycle with other problem-solving processes; not reproduced here.]

For example, another of our colleagues advocates an analytic problem-solving method based on posing questions (see Figure 5.3).4 Forcing analysts to think through all aspects of a problem using a questioning method can prove a powerful tool in helping them to sort out the fundamental structure of a product—and their analysis. The method guides analysts through several stages including analysis (dividing the problem), synthesis (grouping the problem), information selection (identifying the "So What"), and argumentation (forging the line of analysis).

FIGURE 5.3 Problem Solving With the Questions Method

One Problem-Solving Method
• Identify the problem.
  o Asking questions
• Divide the problem.
  o Asking questions
• Question and define the problem.
  o Asking questions
• Group the problem.
  o Asking questions
• Eliminate the obvious and focus on the pertinent.
  o Asking questions
• Answer the questions in a systematic manner.
  o Stating, defining, supporting, clarifying
  o Connecting ideas, words, and themes through transitions or linguistic chain links

CATEGORIES OF ANALYTIC ARGUMENTS

"Habit 2: Begin With the End in Mind."
—Stephen R. Covey, The Seven Habits of Highly Effective People5

The specific analytic strategy and how much time is allocated to each step will depend on the type of analysis you are producing and what level of sophistication is required to answer the question. In the years the authors have been teaching, mentoring, and managing analysts, we find this is one of the hardest concepts to communicate. The depiction of types of analytic arguments in Figure 5.4 is based on an amalgamation of hierarchies and conceptual categories from knowledge management, intelligence, cognitive psychology, and rhetoric and argumentation.6

FIGURE 5.4 The Analytic Spectrum

[Graphic not reproduced. It arrays four types of analysis along a thought-process scale from data-driven to concept-driven: Descriptive Analysis asks What Happened?; Explanatory Analysis asks Why? and identifies cause and effect; Evaluative Analysis asks What Does It Mean? and evaluates and judges; Estimative Analysis asks What Happens Next? and forecasts. Source: Copyright 2016 Pherson Associates, LLC. All Rights Reserved.]

In making the transition from novice to journeyman and from journeyman to expert, analysts must understand where each type of analytic product fits in the spectrum. Many in the law enforcement and homeland security analytic communities seek to move beyond traditional data-, fact-, and case-based research and display to strategic and future-oriented frames. One way to graphically display this range of analytic endeavor is to array the categories and required thinking skills on scales depicting value (reactive to proactive) and complexity (data-driven or simpler to concept-driven or more complex).7 The skills are directly related to lines of analytic argumentation (see Chapter 12).
• Descriptive analysis reports or summarizes what is known about situations, people, places, or objects (see Figure 5.5). It identifies what is valid or worth noting about the Who, What, How, When, and Where and organizes the data in a way that is easy to comprehend and recall. Sherman Kent, the Yale professor whose foundational work on intelligence analysis in the 1950s earned him the title of "the Father of Intelligence Analysis," identified basic intelligence (background information on nations and issues) and current intelligence (reporting on events and items of immediate interest) as examples of descriptive analysis. In law enforcement, bulletins, spot reports, BOLOs ("Be on the lookout"), and analyst case support are examples of descriptive analysis.

FIGURE 5.5 SATs That Describe

WHAT STRUCTURED ANALYTIC TECHNIQUES APPLY?
• Techniques that summarize include:
  o Chronologies & Timelines
  o Matrices
  o Sorting
  o Starbursting
  o Venn Analysis
• Techniques that generalize include:
  o Ranking, Scoring, & Prioritizing
  o Mind Maps
  o Link Charts

Research papers or assessments that clump data and reports without a useful judgment or "So What" and without providing the client an analytic line of argument are descriptive analyses masquerading as higher-order products. If an annual assessment consolidates reports or incidents relating to crimes committed with guns or activities of foreign intelligence services, it is basic intelligence. If it discusses differences over time, highlights changes, or addresses the significance of the events in a broader context, then it becomes explanatory or evaluative.

Most analysts learn the craft of analytic presentation by writing descriptive products that allow them to become familiar with their issues and data sources while they learn the substantive and professional operating environments. All levels of analysis require careful attention to identify and overcome mindsets, challenge thinking, and make good use of colleagues for collaboration. The analytic techniques used at this level involve source selection and assessment and organization at the low end, mostly involving decomposition and visualization.8 Current intelligence has much in common with journalism, which also subscribes to the goals of timeliness, accuracy, and relevance.9
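The first summarizing technique listed in Figure 5.5, Chronologies & Timelines, is simple enough to sketch in a few lines of code. The following is a minimal illustration only; the event records and field names are invented, not drawn from the text.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Event:
    when: date      # date the event occurred
    source: str     # report or document the event came from
    summary: str    # one-line description of what happened

# Hypothetical reporting, listed in the order it arrived.
reports = [
    Event(date(2016, 3, 14), "Field report 12", "Suspect seen near the warehouse"),
    Event(date(2016, 1, 9), "Tip line", "Anonymous caller describes a meeting"),
    Event(date(2016, 2, 2), "Open source", "Local press notes unusual shipments"),
]

# A chronology simply reorders the evidence by time so that gaps
# and sequences become visible to the analyst.
for e in sorted(reports, key=lambda ev: ev.when):
    print(f"{e.when.isoformat()}  [{e.source}] {e.summary}")
```

The value is not in the code but in the reordering: evidence that arrived piecemeal is displayed in the sequence it actually occurred.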
• Explanatory analysis probes the reason or cause of a situation, getting at why it has developed or is transpiring in the way portrayed by valid sources (see Figure 5.6). At this level, analysts do not just organize and report interesting information but must use argumentation to give context for the facts, judgments, and observations about patterns or changes in behavior. Target analysis, a subset of intelligence analysis that focuses on discovering the identities and vulnerabilities of key intelligence objectives, usually falls into this category. Explanatory analysis of complex events may be the basis for intelligence assessments that aim to make sense of complex but ambiguous information and situations, such as economic disruptions or changes in crime patterns. Explanatory analysis is also often included in current intelligence products to provide a rationale for a recent trend, such as increasing violence in a locale or potential test preparations around a missile site.

FIGURE 5.6 SATs That Explain

WHAT STRUCTURED ANALYTIC TECHNIQUES APPLY?
• Explanatory techniques include:
  o Hypothesis Generation
  o Analysis of Competing Hypotheses
  o Structured Analogies
  o Delphi Method
  o Argument Mapping

Explanatory analysis is buttressed by techniques for manipulating and displaying information, then drawing logical conclusions that add value to the evidence and create knowledge through the derivation of novel explanations. The analytic techniques used in explanatory analysis are mostly drawn from the families of decomposition and visualization and hypothesis generation and testing.10 The quality and accuracy of the conclusions depend on analysts' ability to apply expertise (their own and others') to the data, generate and test a variety of hypotheses, and identify diagnostic data to determine the most likely explanations.

• Evaluative analysis examines the significance of a problem or topic as it relates to clients' interests, using logic to interpret and make judgments about various values or meanings behind the data (see Figure 5.7). All of the previously mentioned analytic skills and techniques are used in evaluative analysis, but the distinction is primarily in the structure, data selected, and argumentation of the product. This may entail evaluating the nature of a situation (Is it a military exercise or an attack?), the quality of a course of action (Will the immigration policy have a positive or negative effect on the border?), the extent of a problem (Will falling economic indicators precipitate a crisis?), or the significance of a situation (Do decision makers need to pay attention now rather than next year?).

FIGURE 5.7 SATs That Evaluate

WHAT STRUCTURED ANALYTIC TECHNIQUES APPLY?
• Evaluative techniques include:
  o Cross-Impact Matrix
  o Key Assumptions Check
  o Indicators
  o Indicators Validator®
  o Premortem Analysis
  o Structured Self-Critique
  o Deception Detection

Most evaluative analysis will be written in the form of an assessment. Product success depends not only on the quality of your argumentation and judgments but on how well you have defined the question and whether it holds current or pressing interest for your clients. The structured analytic techniques most often used in evaluative analysis come from the families of idea generation, hypothesis generation and testing, assessment of cause and effect, and challenge analysis.11

• Estimative analysis looks to the future, asking what might happen next and proactively anticipating courses of action that decision makers may take in response to potential stimuli (see Figure 5.8). Estimative analysis by definition is carried by its underlying framework of drivers, influences, and assumptions in the absence of hard data. Forecasts are based on analysts' experience, knowledge, and strategies for modeling evidence that include scenarios and the full range of structured analytic techniques to instill rigor, spur imagination, and challenge mindsets.

FIGURE 5.8 SATs That Estimate

WHAT STRUCTURED ANALYTIC TECHNIQUES APPLY?
• Estimative techniques include:
  o Scenarios Analysis
  o Quadrant Crunching™
  o What If? Analysis
  o High Impact/Low Probability Analysis
  o Red Hat Analysis
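Several techniques in Figure 5.8, notably Scenarios Analysis and Quadrant Crunching™, rest on systematically combining the endpoints of key drivers. The sketch below shows only that enumeration mechanic; the two drivers and their endpoints are invented for illustration and are not the authors' examples.

```python
from itertools import product

# Hypothetical key drivers, each with two plausible endpoints.
drivers = {
    "economy": ["grows", "contracts"],
    "central authority": ["strengthens", "fragments"],
}

# Enumerating every combination of driver endpoints forces the analyst
# to consider futures a single intuitive forecast would likely skip.
for combo in product(*drivers.values()):
    description = "; ".join(f"{d} {v}" for d, v in zip(drivers, combo))
    print("Scenario:", description)
```

With two drivers this yields four candidate futures; real scenarios exercises then flesh out each combination and look for indicators that would signal which one is emerging.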
The authors have a strong preference for the terms estimative and forecast or foresight analysis rather than formulations using the word predict, which imply point solutions that may be appropriate for scientific experimentation but are fraught with danger in assessing developments in the real world. Analysts are not handed a crystal ball at the start of their careers and should not be forced to promise what they cannot realistically deliver consistently and reliably. The job of an analyst is to provide accurate depictions of the range of potential events so that no matter what happens, clients are not surprised. The key to success for an estimative or strategic foresight analyst is to imagine and portray the range of realistic scenarios, what decision makers might observe as the future is unveiled, and the implications of alternatives and choices available to deal with those futures. Structured analytic techniques employed for estimative analysis include idea generation, scenarios analysis, assessment of cause and effect, and challenge analysis.12

Warning analysis is a critical subset of estimative analysis. The primary challenge for the warning analyst is to break free of established analytic mindsets and identify key assumptions that have ceased being valid. Analysts rarely succeed at this task unless they employ structured techniques like Quadrant Crunching™ or Structured Brainstorming that help them reframe an issue or think about it more creatively.

Hindsight bias makes analysts prone to overestimate the accuracy and significance of their products about the future, and it makes clients prone to underestimate the impact of those products on their perspectives and decisions. Analysts can mitigate this through strong analytic strategies and active client service; this provides a rich and consistent line of analytic products that support the ongoing decision-making processes.

COGNITIVE BIAS AND INTUITIVE TRAPS IN ANALYSIS

Cognitive biases are mental errors caused by the brain's simplified information-processing strategies. Some heuristics or experience-based techniques that generate a quick solution can save analysts time but may inject bias into the analysis. Potential causes of such bias include professional experience leading to an ingrained mental mindset, training or education, the nature of one's upbringing, type of personality, past experiences, or personal equity in a given situation.

Hundreds of biases have been described in the academic literature using a wide variety of terms. The authors have identified a set of thirteen heuristics and cognitive biases that intelligence analysts are most likely to experience (see Figure 5.9). These cognitive biases are well documented in the professional literature but do not cover all of the mental mistakes that analysts make. Additional research on the topic by the authors has generated a second list of eighteen intuitive traps that are common, everyday errors analysts make when evaluating evidence, describing cause and effect, estimating probabilities, and evaluating intelligence reporting (see Figure 5.10).

FIGURE 5.9 Glossary of Heuristics and Cognitive Biases

Selected heuristics that—when misapplied—can impede analytic thinking:
• Anchoring effect: Accepting a given value of something unknown as a proper starting point for generating an assessment.
• Associative memory: Predicting rare events based on weak evidence or evidence that easily comes to mind.
• Availability heuristic: Judging the frequency of an event or category by the ease with which instances come to mind.
• Desire for coherence and uncertainty reduction: Seeing patterns in random events as systematic and part of a coherent world.
• Groupthink: Choosing the option that the majority of the group agrees with or ignoring conflicts within the group due to a desire for consensus.
• Mental shotgun: Lacking precision and control while making assessments continuously; providing quick and easy answers to difficult questions.
• Premature closure: Stopping the search for a cause when a seemingly satisfactory answer is found before sufficient information can be collected and proper analysis can be performed.
• Satisficing: Selecting the first answer that appears "good enough."

Selected cognitive biases that can impede analytic thinking:
• Confirmation bias: Seeking only that information that is consistent with the lead hypothesis, judgment, or conclusion.
• Evidence acceptance bias: Accepting data as true unless it was immediately rejected when first reviewed; focusing more on the coherence of the story than the reliability of the underlying data.
• Hindsight bias: Claiming the key items of information, events, drivers, forces, or factors that actually shaped a future outcome could have been easily identified.
• Mirror imaging: Assuming that others will act the same as we would, given similar circumstances.
• Vividness bias: Focusing attention on one vivid scenario while other possibilities or potential alternative hypotheses are ignored.

Source: Copyright 2016 Globalytica, LLC. All Rights Reserved.

Some of the classic intuitive traps analysts encounter are their tendencies to do the following:

• Ignore information when they lack a category for that bit of information
• Discount facts that do not support the analysis
• Overstate conclusions when only a few data points are consistent
• Fail to change the analysis when confronted with mounting contradictions
• Assume the present (or the future) is like the past

"The future is plural."
—Peter Schwartz, author of The Art of the Long View

The role of structured analytic techniques is to help analysts overcome, avoid, or at least mitigate the impact of these cognitive biases and intuitive traps. Structured techniques spur analysts to question their intuitive judgments and think more rigorously about difficult problems. The process by which an analytic conclusion is developed is more transparent and therefore easier for a client to accept than one based solely on traditional intuitive analysis. The use of structured techniques and other critical thinking skills also saves time by expediting review and thereby compressing the production process.

FIGURE 5.10 Glossary of Intuitive Traps

• Assuming inevitability: Assuming that an event was more certain to occur than actually was the case. Also referred to as the illusion of inevitability.
• Assuming a single solution: Thinking in terms of only one likely (and predictable) outcome instead of acknowledging that "the future is plural" and several possible outcomes should be considered.
• Confusing causality and correlation: Inferring causality inappropriately; assuming that correlation implies causation. Also referred to as perceiving cause and effect.
• Expecting marginal change: Focusing on a narrow range of alternatives representing marginal, not radical, change.
• Favoring firsthand information: Allowing information we receive directly to have more impact than what we learn or are told secondhand.
• Ignoring the absence of information: Not addressing the impact of the absence of information on analytic conclusions.
• Ignoring base rate probabilities: Failing to accurately assess the likelihood of an event when faced with statistical facts and ignoring prior probabilities or base rates.
• Ignoring inconsistent evidence: Discarding or ignoring information that is inconsistent with what the analyst expects to see.
• Judging by emotion: Accepting or rejecting what another group member says because the analyst likes or dislikes everything about that person. Also referred to as the halo effect.
• Lacking sufficient bins: Failing to remember or factor something into the analysis because the analyst lacks an appropriate category or "bin" for that item of information.
• Misstating probabilities: Miscommunicating or misperceiving estimates of subjective probability (most likely, could, probable).
• Overestimating probability: Overestimating the probability of multiple independent events occurring in order for an event or attack to take place.
• Overinterpreting small samples: Overdrawing conclusions from a small sample of data that is consistent.
• Overrating behavioral factors: Overrating the role of internal determinants of behavior (personality, attitudes, beliefs) and underestimating the importance of external or situational factors (constraints, forces, incentives). Often referred to as fundamental attribution error.
• Presuming patterns: Believing that actions are the result of centralized planning or direction and finding patterns where they do not exist.
• Projecting past experiences: Assuming the same dynamic is in play when something seems to accord with an analyst's past experiences.
• Rejecting evidence: Continuing to hold to an analytic judgment when confronted with a mounting list of evidence that contradicts the initial conclusion.
• Relying on first impressions: Giving too much weight to first impressions or initial data, especially if they attract our attention and seem important at the time.

Source: Copyright 2016 Globalytica, LLC. All Rights Reserved.
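Two traps in Figure 5.10 lend themselves to quick arithmetic: overestimating the probability of multiple independent events, and ignoring base rate probabilities. A minimal sketch with invented numbers:

```python
# Trap: overestimating the probability of conjunctive events.
# If an attack requires five independent steps, each 80% likely to
# succeed, the overall chance is far lower than intuition suggests.
p_step, n_steps = 0.80, 5
p_all = p_step ** n_steps
print(f"P(all {n_steps} steps succeed) = {p_all:.2f}")  # ~0.33, not ~0.8

# Trap: ignoring base rate probabilities.
# A 95%-accurate indicator of a rare event (1% base rate) still
# produces mostly false alarms, by Bayes' rule.
base_rate, hit_rate, false_alarm = 0.01, 0.95, 0.05
p_positive = hit_rate * base_rate + false_alarm * (1 - base_rate)
p_event_given_positive = hit_rate * base_rate / p_positive
print(f"P(event | positive indicator) = {p_event_given_positive:.2f}")  # ~0.16
```

In both cases the intuitive answer overshoots the computed one by a wide margin, which is precisely why these errors qualify as traps.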
SYSTEM 1 AND SYSTEM 2 THINKING

Intelligence analysts employ a wide array of approaches for describing the various ways they think about problems. Researchers and others who study intelligence analysis write about the relative benefits of qualitative versus quantitative techniques or the use of intuitive versus scientific or empirical methods. They debate passionately and periodically whether intelligence analysis is an art or a science. The National Academies of Science/National Research Council Committee on Behavioral and Social Science Research to Improve Intelligence Analysis for National Security has recommended testing analytic techniques to ensure they are more scientific.13 We are among the first to support testing and research to improve analytic methods, but we have worked long enough in the glare of crisis and with time and resource constraints to understand that intelligence analysis will never be synonymous with scientific research. We need to embrace the art as well as the science of doing good analysis, using the entire range of human cognitive abilities to illuminate difficult issues.

When the US Institute of Peace tackled the question of whether quantitative models based on algorithms developed with historical data should replace qualitative models based on analysts' knowledge and experience, the report concluded that "the best results for early warning are most likely obtained by the judicious combination of quantitative analysis based on forecasting models with qualitative analysis that rests on explicit causal relationships and precise forecasts of its own" and warned decision makers to "insist" on getting the results from multiple methods.14

Over the past two decades, substantial research has been done on the cognitive processes involved in human judgment. One result has been the emergence of Dual Process theory, which posits two systems of thinking: System 1, which is intuitive, fast, efficient, and often unconscious, and System 2, which is analytic, slow, deliberate, and conscious.15 This distinction is best described in Daniel Kahneman's book, Thinking, Fast and Slow.16 Analysts traditionally have relied mostly on System 1 thinking—intuitive judgment—when constructing their lines of analysis, drawing on evidentiary reasoning, historical case studies, and reasoning by analogy. System 2 thinking posits four distinct methodological approaches to analysis that vary according to the following criteria:17

• Whether the data to be manipulated are available for use in the analysis or are partially or entirely unknown
• Whether the analytic techniques to be used are qualitative or quantitative

Each approach requires distinct expertise, some of which analysts will have learned in undergraduate studies and some of which is pursued in specialized and advanced programs (see Figure 5.11). Analysts should strive to become familiar with all types of analytic methods and proficient with several.18

FIGURE 5.11 Matching Problem-Solving Techniques to Types of Thinking

[Graphic not fully recoverable. It maps System 1 Thinking to Intuitive Judgment and divides System 2 Thinking by data availability and technique type: qualitative Expert Judgment applied to known data (getting started, vetting information, making the case, conveying the message); qualitative Structured Analytic Techniques applied to known and unknown data (innovation, diagnostic, strategic foresight); quantitative Empirical Analysis applied to known data (computer tools, statistical techniques); and quantitative Quasi-Quantitative Analysis applied to known and unknown data (computer-based, expert-generated data). Source: Copyright 2016 Globalytica, LLC. All Rights Reserved.]

• Expert judgment helps the analyst structure mostly known data to generate qualitative evaluations. Analysts should learn many of these skills in liberal arts and social science academic programs, often in conjunction with geographic area or language expertise.

• Empirical analysis, statistics, and data-based computer models apply quantitative techniques to known data. Analysts should have sufficient grounding in mathematics and basic statistics to organize and aggregate quantitative data.

• Structured analytic techniques provide a means to instill more of the rigor of scientific inquiry into qualitative processes involving ambiguous situations where all the facts are not and may never be known. By externalizing the analyst's thinking step by step, it can be examined, questioned, and compared in collaboration with others to overcome mindsets, creatively anticipate the potential for disruptive change, and focus on information that helps distinguish one hypothesis or developing scenario from another. Over the past fifteen years, these techniques have become broadly taught and used by intelligence communities across the globe and increasingly in academia, business consulting, and private industry.

• Quasi-quantitative analysis is performed by computer-based models that attempt to deal with the unknown or unknowable by incorporating expert estimates into their algorithms. Special procedures designed to do this consistently and logically include Bayesian networks and simulation techniques. Designing and building these models is a specialized field; translating the insights from these models is often best accomplished in conjunction with qualitative methods.
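To make the idea of folding expert estimates into a model concrete, here is a minimal Monte Carlo sketch. The expert ranges and the scenario are invented, and real quasi-quantitative models such as Bayesian networks are far more elaborate; this shows only the basic mechanic of simulating over elicited estimates.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Hypothetical expert estimates: each expert offers a low-high range
# for the probability that an indicator event occurs this year.
expert_ranges = [(0.20, 0.40), (0.10, 0.30), (0.25, 0.50)]

TRIALS = 100_000
hits = 0
for _ in range(TRIALS):
    # Pick an expert at random, sample a probability from that
    # expert's range, then simulate whether the event occurs.
    low, high = random.choice(expert_ranges)
    p = random.uniform(low, high)
    hits += random.random() < p

print(f"Simulated probability of the event: {hits / TRIALS:.2f}")
```

The design point is the one made in the text: the model does not remove the subjectivity of the expert inputs, but it combines them consistently and makes the resulting estimate transparent and repeatable.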
PLANNING YOUR ANALYTIC PROJECT

Whether you are preparing to write about current events, interpret a newly collected set of data, explore emerging trends, or look into the future, your plan for research and production showcases what you are learning and the quality of your analytic skills and tradecraft. Here are some important things to keep in mind:

• Write down your plan and change it as needed rather than researching without a strategy, plan, or structure. Your plan and your products are the yardstick by which your analysis will be measured. An explicit strategy becomes particularly critical when you are engaged in a lengthy, high-profile, or multi-organization project.
• Plan for multiple products to highlight your progress. Research aids can provide valuable way stations as part of the process for producing a longer analytic product. Short pieces on new developments help you develop the expertise needed to produce longer papers on difficult, evolving, or more complex issues.
• Keep a list of your key assumptions, intelligence questions, and multiple hypotheses to be explored. Keep in mind that you are looking for evidence to disprove or eliminate a hypothesis. Review these lists as you complete the final draft of your paper or presentation.
• Search for the best information in the time you have available. Keep the ratio of time spent in research and production in balance. This is particularly useful if you can contact experts in government, academia, or private industry, or levy requirements on field collectors or external partners rather than being a prisoner of your inbox.
• Beware of the most common analytic pitfalls:
  o Not defining the problem or issue correctly
  o Jumping to a solution before analyzing the problem
  o Not involving people who know most about the problem
  o Not having an open mind
  o Using the wrong criteria
  o Mirror imaging or assuming others think or act as you would
  o Assuming actors have more control or power than they do

WRITING A CONCEPT PAPER

Concept papers help you tighten your focus before you start researching and writing, serving as a research design contract between you, your boss, and your collaborators on how the paper will be produced. They are not product outlines and do not contain your judgments (see Figure 5.12). In contrast, a Terms of Reference (TOR) paper (see Chapter 4) establishes expectations with your reviewers, coordinators, and perhaps your clients on the substantive scope of the draft and when they will receive it.

FIGURE 5.12 Concept Paper Outline

1. Working title
2. Author(s)
3. Type of product
4. Scope: Brief statement of the thrust of the paper, why it is being written, what it is intended to accomplish, key questions it will answer.
5. Audience: Key client(s) and key issues they want addressed.
6. Data gathering: Best sources of information, potential gaps, collection requirements, data analysis methods.
7. Outline: Preliminary ordering of topics to be covered.
8. Deadlines: Timeline for producing draft and publishing paper.
9. Contributions: Need for input from other offices or experts and tasks they will perform.
10. Resource requirements: Estimated overall level of effort for principal drafter and contributors.
11. Methodological support: List of analytic techniques that could enhance the analysis and work plan for employing them.
12. Graphics: Preliminary list of figures, including text boxes, that could help tell the story.
13. Collaboration: Potential sources of expertise both within and outside of the organization.
14. Coordination: Who must coordinate the draft before publication?

A good concept paper should, at a minimum, accomplish all of the following tasks:

• Identify the key questions
• Identify the client or customer set and their requirements
• Lay out a research plan
• Identify key sources and methods
• Outline the line of argument
• Indicate needed resources
• Propose a timeline for completion

Make sure everyone involved in the project signs off on the concept paper before you start work. You now have something to refer back to when others adopt different views on the subject, you change your analysis or run into resource problems, managers or clients change their minds about what they want or when they want it, or some other issue takes priority.
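Teams that track projects electronically sometimes capture the elements of Figure 5.12 as structured data so that the sign-off rule can be checked before work begins. A minimal sketch of that idea; the class, field names, and example values are hypothetical and not part of the authors' method.

```python
from dataclasses import dataclass, field

@dataclass
class ConceptPaper:
    working_title: str
    authors: list[str]
    product_type: str
    scope: str
    audience: str
    deadlines: str
    signoffs: list[str] = field(default_factory=list)  # boss, collaborators

    def ready_to_start(self) -> bool:
        # Mirror the chapter's rule: no work until everyone signs off.
        return bool(self.signoffs)

paper = ConceptPaper(
    working_title="Regional Instability Outlook",
    authors=["Analyst A"],
    product_type="strategic assessment",
    scope="Key drivers of instability over the next two years",
    audience="Policy clients tracking the region",
    deadlines="Draft in six weeks",
)
paper.signoffs.append("supervisor")
print("Ready to start:", paper.ready_to_start())  # True once signed off
```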
CONSIDERING RESEARCH METHODS

Research Purpose

The research phase involves both searching for the information and processing it to determine what it means with regard to the questions you are seeking to answer. Research methods are the ways—tools, techniques, and processes—you use to collect your data. Some use the terms methods and methodologies interchangeably, but they are at different levels of abstraction. You select appropriate methods—the "how-to"—from the range of research methodologies—the "why to choose this method"—to discover insights and gain knowledge. Quantitative research methodologies, for instance, include statistical methods. Qualitative methodologies include methods such as interviewing and focus groups. The methodology, of course, is based on the reason for the analysis and the type of argument you anticipate making in your product (see Figure 5.4). Depending upon your discipline and issue, your research might have one or more of these purposes:

• Exploratory research seeks to define and understand the key components or variables and how they relate to one another. It might involve a literature search, Internet research, or expert interviews to build the conceptual framework on which your work on this issue will be based. It is the first step in attacking any new issue or project, whether in management, science, or social disciplines, but analysts need to be disciplined in their searches and not allow exploratory research to eat up valuable time needed for the rest of the analytic process. Exploratory research establishes the intellectual footing that will help you fine-tune your questions, suggest methods or areas for investigation as part of this study or a following one, propose products for the near term or longer term, or start to pinpoint the criteria or drivers for conceptual types of analysis for which data does not readily exist (see Figure 5.2).

• Descriptive research aims to provide an accurate description of the issue and its key components; it fills in the features—the Who, What, How, When, and Where—that comprise Descriptive Analysis on the Analytic Spectrum. What demographic or economic data will help us understand population movements in and out of urban areas? Who is managing and providing expert advice to political candidates, and how does that play out in the candidates' platforms and communications strategies?

• Explanatory research probes the cause/effect relationships of the components, seeking the data that enable testing of hypotheses to understand how strongly variables are correlated and whether some are causes of others. Empirical research refers to experiments carried out to test hypotheses based on observations using physical senses or other sensing mechanisms.

• Evaluative research enables the construction of analytic arguments by providing evidence to support judgments and assessments about fast-moving events or recommendations for policy changes. Are risks increasing to US travelers to France, and how can those risks be mitigated? What are the best strategies for improving the quality of analysis in our organization? Researchers must be careful to explore the full set of variables and find the best descriptive data, challenging the relationships among the variables to make solid arguments and not solutions with cherry-picked reasons to support them.

QUANTITATIVE VS. QUALITATIVE

Researchers commonly distinguish between quantitative and qualitative methods, but many use a combination of both in their studies. The easiest way to distinguish between the two is that quantitative methods deal with numerical data and qualitative methods with words or images. Many projects use both methods—for instance, immigration data combined with interviews or case studies give a richer picture of the scope and detail of challenges on the southwestern US border. A short sketch following the two bullets below illustrates the contrast.

• Quantitative. If the data you are collecting involve entities (people, places, things, events), then you can count and measure them using statistical methods to collate, interpret, compare, and verify your numbers. Mechanisms for collecting quantitative data include document reviews for numeric information, surveys, structured interviews and observations, and statistical applications.
  o Benefits: Data can be gathered from large numbers of participants, compared across groups, and generalized to broader populations. Numerical information gives the impression of concreteness and is easy to use with statistical techniques to determine relations between variables.
  o Cautions: Quantitative methods provide an evidence-based perspective that appears objective and factual, but statistics and graphical representations can be misleading (see Chapter 18) or just plain wrong and have difficulty recognizing and accounting for new phenomena. Quantitative data intended to measure opinions, beliefs, or preferences should be collected following established best practices, but as they are estimates provided by humans in response to questions posed by humans, they are more subjective than the numbers might appear on the surface.

• Qualitative. Qualitative methods seek to get at meanings, focusing on and describing subjects' experiences, perspectives, attitudes, beliefs, and values that might explain their behaviors, and doing so in ways that cannot be conveyed in numbers. Mechanisms for collecting qualitative data include document exploitation for themes; observation of participant activity; interviews, surveys, or questionnaires that are unstructured or semi-structured using open-ended questions; focus groups; and case studies.
  o Benefits: Qualitative methods do not have response limitations, so they can provide richer content, relate anecdotal information, uncover new phenomena, and aid in deeper, more nuanced understanding. Verbal information can be captured in digital form to take advantage of the strengths of quantitative techniques.
  o Cautions: Verbal input can be difficult to generalize and to use to assess relationships between variables. It cannot be directly assessed by statistical methods.
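A toy example can make the quantitative/qualitative contrast concrete. Both the survey responses and the theme words below are invented; this sketches the distinction, not a recommended research workflow.

```python
from statistics import mean, stdev
from collections import Counter

# Hypothetical survey: a numeric rating (quantitative) plus a
# free-text comment (qualitative) from each respondent.
responses = [
    (4, "Border staffing feels stretched thin at peak hours"),
    (2, "Wait times are fine but staffing rotations confuse people"),
    (5, "Peak-hour crowding is the main complaint I hear"),
]

# Quantitative: summarize the numbers statistically.
ratings = [r for r, _ in responses]
print(f"mean={mean(ratings):.1f}, stdev={stdev(ratings):.1f}, n={len(ratings)}")

# Qualitative: look for recurring themes in the words.
theme_words = ("staffing", "peak", "wait")
counts = Counter(
    word for _, text in responses
    for word in theme_words if word in text.lower()
)
print("recurring themes:", counts.most_common())
```

The numeric summary generalizes easily but says nothing about why respondents feel as they do; the theme counts hint at the why but resist statistical treatment, which is the trade-off the section describes.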
W. L. Neuman, in one of the most highly regarded textbooks on research methods, offers detailed explanations on how to undertake specific methods and provides a useful table regarding the differences between qualitative and quantitative research methods (see Figure 5.13).19

FIGURE 5.13 Comparing Quantitative and Qualitative Research Methods

Quantitative: Objective is to test hypotheses that the researcher generates.
Qualitative: Objective is to discover and encapsulate meanings once the researcher becomes immersed in the data.

Quantitative: Concepts are in the form of distinct variables.
Qualitative: Concepts tend to be in the form of themes, motifs, generalizations, and taxonomies. However, the objective is still to generate concepts.

Quantitative: Measures are systematically created before data collection and are standardized as far as possible; e.g., measures of job satisfaction.
Qualitative: Measures are more specific and may be specific to the individual setting or researcher; e.g., a specific scheme of values.

Quantitative: Data are in the form of numbers from precise measurement.
Qualitative: Data are in the form of words from documents, observations, and transcripts. However, quantification is still used in qualitative research.

Quantitative: Theory is largely causal and is deductive.
Qualitative: Theory can be causal or noncausal and is often inductive.

Quantitative: Procedures are standard, and replication is assumed.
Qualitative: Research procedures are particular, and replication is difficult.

Quantitative: Analysis proceeds by using statistics, tables, or charts and discussing how they relate to hypotheses.
Qualitative: Analysis proceeds by extracting themes or generalizations from evidence and organizing data to present a coherent, consistent picture. These generalizations can then be used to generate hypotheses.

Source: W. L. Neuman, Social Research Methods: Qualitative and Quantitative Approaches, 7th ed. (Boston: Allyn and Bacon, 2014).

CHOOSING YOUR METHODS

What is the best means to get information that will help you answer the questions you have posed? For each question, you can brainstorm the possible sources of information and means of collecting it (see Figure 5.14). Work on the assumption that you have multiple ways of accessing information, some of which will be more feasible within your time and resource limitations and some of which will be more essential to ensuring the quality of your study. Your ultimate plan will be a series of tradeoffs to maximize your available time and capabilities.

FIGURE 5.14 Research Method Planning Matrix

[Template matrix with columns: Research Questions | Evidence Needed to Answer Questions | Methods to Gather Evidence (Quantitative, Qualitative) | Difficulty | Priority. Rows are filled in by the analyst. Source: Copyright 2016 Globalytica, LLC. All Rights Reserved.]

As you make your trade-offs for collection activity (see the sketch after this list), you will understand in which order you should:

• Identify and check your assumptions
• Outline the Who, Where, What, and When of the collection method
• Identify specific collection procedures or statistical tests to be followed
• Specify requirements for materials and other logistical and personnel support
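Analysts who keep the Figure 5.14 matrix in electronic form can sort it to decide what to collect first. A minimal sketch with invented rows, assuming a simple ordering rule (highest priority first, then least difficult); neither the rows nor the rule comes from the text.

```python
# Hypothetical rows of the research method planning matrix.
# difficulty and priority are on 1-5 scales (1 = easiest / lowest).
plan = [
    {"question": "Who funds the network?", "evidence": "financial records",
     "method": "document review (quantitative)", "difficulty": 4, "priority": 5},
    {"question": "How do recruits join?", "evidence": "member accounts",
     "method": "interviews (qualitative)", "difficulty": 3, "priority": 4},
    {"question": "Where are safe houses?", "evidence": "local reporting",
     "method": "open-source search (qualitative)", "difficulty": 1, "priority": 4},
]

# Order collection: highest priority first; among equal priorities,
# start with the least difficult method.
for row in sorted(plan, key=lambda r: (-r["priority"], r["difficulty"])):
    print(f'P{row["priority"]}/D{row["difficulty"]}: {row["question"]} -> {row["method"]}')
```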
STANDARDS FOR RESEARCH METHODS QUALITY

Research standards have been recommended or defined for some disciplines or specialized areas of research, but they have not been generalized. In our opinion, the most important rule is to design your research methods to withstand the scrutiny of your peers, supervisors, and reviewers who might support or criticize your work.

• Ensure your research methods clearly address the questions you are answering.
• Justify your scope and methods based on solid exploratory research of the context and existing literature or datasets.
• Make your methods and processes transparent to facilitate external review.
• Identify clearly how you conceptualized representative samples and implemented quantitative measurements.
• Evaluate the quality of all sources and the currency and accuracy of all data.
• Avoid asking leading questions in surveys, interviews, or focus groups.
• Test any research instruments before using them in a live research setting.
• Obtain appropriate permissions from any human subjects or interviewees, protecting their personal information and ensuring they are fully informed of the research purpose and how their input will be used.
• Check your key assumptions and the potential impact of bias.
• Buttress findings or conclusions when possible with multiple sources of information.
• Consider alternative explanations for findings or conclusions.
• Keep careful and accurate notes, adhering to production standards for research reports.

UNDERSTANDING THE LIMITS OF RESEARCH

The next section will go into more detail on types and evaluation of information sources. We cannot mention frequently enough, however, that all the sources, research, and hypothesis testing in the world will not yield answers to today's most difficult and relevant questions. Good research and good data are necessary but not sufficient to stay on top of the most pressing issues in our changing world or to find solutions to our most serious problems. Analysts must learn the critical thinking skills, modeling, and structured analytic techniques to reason through problems for which they do not have—nor will ever have—all the data they need to come up with certain solutions.

KEY TAKEAWAYS

• An explicitly defined strategy builds credibility for the product by providing a baseline the analyst can use to communicate progress and make adjustments to deal with changing needs and deadlines.
It guides the analyst's research and analysis by sorting the wheat from the chaff and identifying information gaps and other areas in which to create knowledge.

• Critical thinking involves asking the right questions, identifying the analyst's own assumptions, reaching out for sources, assessing the data for quality and pertinence to answering the question, forming and evaluating hypotheses, and drawing conclusions. The process of planning, collecting, assessing, and concluding is similar to most problem-solving processes.

• The strategy will depend upon whether the analyst's analytic argument will be descriptive, explanatory, evaluative, or estimative; what kind of data—if any—exist regarding the question; and whether to use qualitative or quantitative methods to exploit these data.

• Analysts should make a plan; write down their plan, assumptions, questions, and hypotheses; and monitor whether any of these change while drafting the paper.

• Concept papers are a research design contract between the author, the supervisor, and other analysts that establishes the scope and timeline for the product. They focus attention on predefined stages and keep the analyst from veering off into unproductive or time-consuming areas of research.

• Before starting work, analysts should always check that the product meets the threshold for being new, different, or of particular or pressing interest to the client.

• Research methods can serve the purposes of exploring the issue, describing the components and dynamics, explaining relationships, and evaluating evidence to support analytic arguments.

• Quantitative methods allow numerical manipulation of data; qualitative methods focus more on verbal nuances.

• Research plans should be transparent, address the questions, follow best methodological practices, check for the impact of bias, consider multiple sources of information and alternative conclusions, and be well documented.

• Critical thinking skills and structured analytic techniques help analysts reason their way through problems for which they do not have—nor will ever have—all the data they need for deductive conclusions.

Review Case Study III, "The End of the Era of Aircraft Carriers," and briefly answer the following questions:

• How would you define the issue for a global strategist, naval commander, weapons designer, or a senior policy official?
• What information is each client likely to request?
• Where would you look for information on the new technologies and evolving naval strategies, and how would you structure a research plan?
• Name three cognitive biases and intuitive traps to which analysts working on this issue might fall victim.
• Is your analytic argument descriptive, explanatory, evaluative, or estimative? Does sufficient data exist to address this issue?
• How could a conference of experts help to answer the key questions?

NOTES

1. Yogi Berra, When You Come to a Fork in the Road, Take It! Inspiration and Wisdom From One of Baseball's Greatest Heroes (New York: Hyperion, 2002), 53.
2. We are indebted to Cynthia Storer for sharing this metaphor with us; we have incorporated it into our teaching materials.
3. Robert Coram, Boyd: The Fighter Pilot Who Changed the Art of War (New York: Back Bay Books, 2002).
4. We thank Frank Marsh for sharing this analytic problem-solving method with us in his comments on the outline for this book.
5. Stephen R. Covey, The Seven Habits of Highly Effective People (New York: Simon & Schuster, 2013), 102.
6. This categorization of types of analytic arguments is based on a wide range of academic and career experience but is specifically informed by Aristotle's Rhetoric; contributions of Sherman Kent as cited in Rob Johnston, "Foundations for Meta-Analysis: Developing a Taxonomy of Intelligence Analysis Variables," Studies in Intelligence 47, no. 3 (2003); Russell Ackoff as cited in Bellinger et al., "Data, Information, Knowledge, Wisdom," 2004, www.systems-thinking.org/dikw/dikw.htm; and David T. Moore, Species of Competencies for Intelligence Analysis (Washington, DC: Advanced Analysis Lab, National Security Agency, 2003).
7. In the late 1970s, a study conducted for the US Army on signals intelligence and imagery intelligence task processes concluded that "intelligence analysis is conceptually driven as opposed to data driven. What is critical is not just the data collected, but also what is added to those data in interpreting them via conceptual models in the analyst's store of knowledge." Robert V. Katter, Christine A. Montgomery, and John R. Thompson, "Human Processes in Intelligence Analysis: Phase I Overview," Research Report 1237 (Woodland Hills, CA: Operating Systems, Inc., December 1979).
8. See Heuer and Pherson, Structured Analytic Techniques, Chapter 4 on decomposition and visualization, 47-96.
9. Office of the Director of National Intelligence, "Intelligence Community Directive 203: Analytic Standards."
10. See Heuer and Pherson, Structured Analytic Techniques, Chapter 4 on decomposition and visualization, 47-96, and Chapter 7 on hypothesis generation and testing, 165-201.
11. See Heuer and Pherson, Structured Analytic Techniques, Chapter 5 on idea generation, 99-129; Chapter 7 on hypothesis generation and testing, 165-201; Chapter 8 on assessment of cause and effect, 205-230; and Chapter 9 on challenge analysis, 233-270.
12. See Heuer and Pherson, Structured Analytic Techniques, Chapter 5 on idea generation, 99-129; Chapter 6 on scenarios analysis, 133-161; Chapter 8 on assessment of cause and effect, 205-230; and Chapter 9 on challenge analysis, 233-270.
13. Committee on Behavioral and Social Science Research to Improve Intelligence Analysis for National Security; National Research Council, Intelligence Analysis for Tomorrow: Advances From the Behavioral and Social Sciences (Washington, DC: The National Academies Press, 2011), 84.
14. Jack A. Goldstone, "Using Quantitative and Qualitative Models to Forecast Instability," US Institute of Peace Special Report, no. 204 (March 2008): 1.
15. For further information on Dual Process theory, see the research by Jonathan Evans and Keith Frankish in Two Minds: Dual Processes and Beyond (Oxford, UK: Oxford University Press, 2009) and Pat Croskerry in "A Universal Model of Diagnostic Reasoning," Academic Medicine 84, no. 8 (2009).
16. Daniel Kahneman, Thinking, Fast and Slow (New York: Farrar, Straus and Giroux, 2011).
17. A more detailed description of System 1 and System 2 thinking and the four types of System 2 thinking can be found in Heuer and Pherson, Structured Analytic Techniques, 19-24.
18. Examples of models that have been used in the US Intelligence Community to support empirical, quasi-quantitative, and structured forms of analysis are provided in Chapter 7 and Chapter 18.
19. W. L. Neuman, Social Research Methods: Qualitative and Quantitative Approaches, 7th ed. (Boston: Allyn and Bacon, 2014).