Structured analytic techniques (SATs) are intended to improve intelligence analysis by mitigating commonly observed cognitive biases, the canonical sources of analytic error.
Analysts have also found that use of a structured process helps to depersonalize arguments when there are differences of opinion. Fortunately, today's technology and social networking programs make structured collaboration much easier than it has ever been in the past.
It is another matter altogether for analysts to learn to select, understand, and actually use structured analytic techniques, and to use them well. That is what this book is about.
From the several hundred techniques that might have been included here, we developed a taxonomy for a core group of fifty techniques that appear to be the most useful for the Intelligence Community, as well as for those engaged in related analytic pursuits in academia, business, law enforcement, finance, and medicine. This list, however, is not static. It is expected to increase or decrease as new techniques are identified and others are tested and found wanting. Some training programs may have a need to boil down their list of techniques to the essentials required for one particular type of analysis.
No one list will meet everyone's needs.
However, we hope that having one fairly comprehensive list and common terminology available to the growing community of analysts now employing structured analytic techniques will help to facilitate the discussion and use of these techniques in projects involving collaboration across organizational boundaries.
The most common criticism of structured analytic techniques is, "I don't have enough time to use them." The experience of many analysts shows that this criticism is not justified.
Many techniques take very little time. Anything new does take some time to learn, but, once learned, the use of structured analytic techniques often saves analysts time. It can enable the individual analyst to work more efficiently, especially at the start of a project when the analyst may otherwise flounder a bit in trying to figure out how to proceed.
Structured techniques usually aid group processes by improving communication as well as enhancing the collection and interpretation of evidence. And in the end, a structured technique usually produces a product in which the reasoning behind the conclusions is more transparent and readily accepted than one derived from other methods.
This generally saves time by expediting review by supervisors and editors and thereby compressing the coordination process. In-depth interviews of fifteen Defense Intelligence Agency analysts concerning their use of Intellipedia found that whether or not an analyst contributed to Intellipedia was determined by a more profound rationale than membership in a particular age group. Some older analysts contributed and some from the younger generation did not. The Defense Intelligence Agency's Knowledge Lab hypothesized that this depends in part on how analysts view their responsibility to the customer.
For example, if an analyst believes that the customer is better served when the uncertainty and diversity associated with knowledge is revealed to the customer, then the analyst is more likely to share and collaborate throughout the knowledge creation process. If the analyst believes that the customer is better served by being certain about what is delivered, then the analyst is more likely to wait until near the final stage of product delivery before sharing.
For the former, collaborative behavior is not constrained by a need for ownership. For the latter, collaborative behavior is constrained by ownership needs.
However, the origin of the concept goes back to the 1980s, when the eminent teacher of intelligence analysis, Jack Davis, first began teaching and writing about what he called "alternative analysis." In the mid-1980s some initial efforts were made to initiate the use of more alternative analytic techniques in the CIA's Directorate of Intelligence. Under the direction of Robert Gates, then CIA Deputy Director for Intelligence, analysts employed several new techniques to generate scenarios of dramatic political change, track political instability, and anticipate military coups.
Douglas MacEachin, Deputy Director for Intelligence from 1993 to 1995, supported new standards for systematic and transparent analysis that helped pave the path to further change. Another impetus was Admiral David Jeremiah's postmortem analysis of the Intelligence Community's failure to foresee India's 1998 nuclear test; the Jeremiah report specifically encouraged increased use of what it called "red team analysis." When the Sherman Kent School for Intelligence Analysis at the CIA was created to improve the effectiveness of intelligence analysis, John McLaughlin, then Deputy Director for Intelligence, tasked the school to consolidate techniques for doing what was then referred to as "alternative analysis."
In response to McLaughlin's tasking, the Kent School developed a compilation of techniques, and the CIA's Directorate of Intelligence started teaching these techniques in a class that later evolved into the Advanced Analytic Tools and Techniques Workshop.
The course was subsequently expanded to include analysts from the Defense Intelligence Agency and other elements of the Intelligence Community.

"Wisdom begins with the definition of terms." (Socrates, Greek philosopher)

The various investigative commissions that followed the surprise terrorist attacks of September 11, 2001, and then the erroneous analysis of Iraq's possession of weapons of mass destruction, cranked up the pressure for more alternative approaches to intelligence analysis.
For example, the Intelligence Reform Act of 2004 assigned to the Director of National Intelligence responsibility for ensuring that, as appropriate, elements of the intelligence community conduct alternative analysis (commonly referred to as red-team analysis) of the information and conclusions in intelligence analysis. Over time, however, analysts who misunderstood or resisted this approach came to interpret alternative analysis as simply meaning an alternative to the normal way that analysis is done, implying that these alternative procedures are needed only occasionally, in exceptional circumstances, when an analysis is of critical importance.
Kent School instructors had to explain that the techniques are not alternatives to traditional analysis; rather, they are central to good analysis and should be integrated into the normal routine, instilling rigor and structure into the analyst's everyday work process. When the Kent School later decided to update its training materials based on lessons learned during the previous several years, Randy Pherson and Roger Z. George were among the drafters.
There was a sense that the name alternative analysis was too limiting and not descriptive enough. At least a dozen different analytic techniques were all rolled into one term, so we decided to find a name that was more encompassing and suited this broad array of approaches to analysis. Roger George organized the techniques into three categories: diagnostic techniques, contrarian techniques, and imagination techniques.
The term "structured analytic techniques" became official when the updated training materials were formally approved. The Directorate of Intelligence's senior management became a strong supporter of structured analytic techniques and took active measures to facilitate and promote this approach.
The term is now used throughout the Intelligence Community, and increasingly in academia and allied intelligence services overseas. Senior analysts with whom we have spoken believe the term "alternative analysis" should be relegated to history.
It is often misunderstood, and, even when understood correctly, it covers only part of what is now regarded as structured analytic techniques. One thing cannot be changed, however, in the absence of new legislation: the statutory mandate for "alternative analysis." For purposes of compliance with this act of Congress, the DNI interprets the law as applying to both alternative analysis and structured analytic techniques.
Chapter 2 Building a Taxonomy defines the domain of structured analytic techniques by describing how it differs from three other major categories of intelligence analysis methodology. It presents a taxonomy with eight distinct categories of structured analytic techniques.
The categories are based on how each set of techniques contributes to better intelligence analysis. Chapter 3 Selecting Structured Techniques describes the criteria we used for selecting techniques for inclusion in this book, discusses which techniques might be learned first and used the most, and provides a guide for matching techniques to analyst needs.
Analysts using this guide answer twelve abbreviated questions about what the analyst wants or needs to do. An affirmative answer to any question directs the analyst to the appropriate chapter(s), where the analyst can quickly zero in on the most appropriate technique(s). Chapters 4 through 11 each describe one taxonomic category of techniques; taken together, they cover fifty different techniques.
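The question-to-chapter guide described above can be sketched as a simple lookup. The five questions and mappings below are invented for illustration; they are not the book's actual twelve questions.

```python
# Hypothetical sketch of the technique-selection guide: yes/no questions
# that point the analyst to a chapter. Questions here are illustrative.
GUIDE = {
    "Do I need to organize or visualize my data?": "Chapter 4: Decomposition and Visualization",
    "Do I need to generate new ideas or hypotheses?": "Chapter 5: Idea Generation",
    "Do I need to anticipate how events might unfold?": "Chapter 6: Scenarios and Indicators",
    "Do I need to examine cause and effect?": "Chapter 8: Assessment of Cause and Effect",
    "Do I need to challenge my mental model?": "Chapter 9: Challenge Analysis",
}

def recommend(answers):
    """Return the chapters matching every question answered 'yes' (True)."""
    return [chapter for question, chapter in GUIDE.items() if answers.get(question)]

picks = recommend({"Do I need to generate new ideas or hypotheses?": True})
```

An analyst answering "yes" only to the idea-generation question would be directed to Chapter 5.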
Each of these chapters starts with a description of that particular category of techniques and how it helps to mitigate known cognitive limitations or pitfalls that cause analytic errors. It then provides a one-paragraph overview of each technique. This is followed by a detailed discussion of each technique including when to use it, value added, potential pitfalls when appropriate, the method, relationship to other techniques, and sources.
Readers who go through these eight chapters of techniques from start to finish may perceive some overlap. This repetition is for the convenience of those who use this book as a reference book and seek out individual sections or chapters. The reader seeking only an overview of the techniques as a whole can save time by reading the introduction to each technique chapter, the one-paragraph overview of each technique, and the full descriptions of only those specific techniques that pique the readers interest.
Highlights of the eight chapters of techniques are as follows: Chapter 4 Decomposition and Visualization covers the basics such as Checklists, Sorting, Ranking, Classification, several types of Mapping, Matrices, and Networks. Chapter 5 Idea Generation presents several types of brainstorming. That includes Nominal Group Technique, a form of brainstorming that has rarely been used in the Intelligence Community but should be used when there is concern that a brainstorming session might be dominated by a particularly aggressive analyst or constrained by the presence of a senior officer.
A Cross-Impact Matrix supports a group learning exercise about the relationships in a complex system. Chapter 6 Scenarios and Indicators covers three scenario techniques and the Indicators used to monitor which scenario seems to be developing. There is a new technique called the Indicators Validator developed by Randy Pherson.
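A Cross-Impact Matrix like the one described above can be sketched in a few lines. The variables and influence scores below are invented for demonstration; the point is the structure: each cell records how strongly one variable drives another, and the row and column sums show which variables drive the system and which are driven by it.

```python
# Illustrative Cross-Impact Matrix: rows influence columns.
# Variables and 0-3 scores are hypothetical.
variables = ["economy", "unrest", "regime stability"]
# impact[i][j] = estimated influence of variables[i] on variables[j]
impact = [
    [0, 3, 2],   # economy strongly drives unrest, moderately drives stability
    [1, 0, 3],   # unrest strongly drives regime stability
    [2, 1, 0],   # regime stability feeds back on economy and unrest
]

# Row sums: how much each variable drives the system overall.
influence = {v: sum(row) for v, row in zip(variables, impact)}
# Column sums: how much each variable is driven by the others.
dependence = {v: sum(col) for v, col in zip(variables, zip(*impact))}
```

In a group exercise, filling in and debating each cell is the learning step; the sums merely summarize where the discussion pointed.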
Chapter 8 Assessment of Cause and Effect includes the widely used Key Assumptions Check and an important new technique called Structured Analogies, which comes from the forecasting literature. Chapter 9 Challenge Analysis helps analysts break away from an established mental model to imagine a situation or problem from a different perspective. Two important new techniques developed by the authors, Premortem Analysis and Structured Self-Critique, give analytic teams viable ways to imagine how their own analysis might be wrong.
What If? Analysis, Devil's Advocacy, Red Team Analysis, and the Delphi Method can be used by management to actively seek alternative answers. Chapter 10 Conflict Management explains that confrontation between conflicting opinions is to be encouraged, but it must be managed so that it becomes a learning experience rather than an emotional battle. Two new techniques are introduced: Adversarial Collaboration (actually a family of techniques) and an original approach to Structured Debate in which debaters refute the opposing argument rather than supporting their own.
Chapter 11 Decision Support includes four techniques, including Decision Matrix, that help managers, commanders, planners, and policymakers make choices or tradeoffs between competing goals, values, or preferences. This chapter also includes Richards Heuer's new Complexity Manager technique. As previously noted, analysis in the U.S. Intelligence Community is now in a transitional stage from a mental activity performed predominantly by a sole analyst to a collaborative team or group activity.
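A Decision Matrix of the kind Chapter 11 describes can be sketched as weighted scoring. The options, criteria, weights, and scores below are hypothetical; the technique is simply to make the tradeoffs explicit and arithmetic.

```python
# Minimal Decision Matrix sketch: score options against weighted criteria.
# All names, weights, and scores are invented for illustration.
criteria_weights = {"cost": 0.3, "risk": 0.3, "effectiveness": 0.4}
options = {
    "Option A": {"cost": 7, "risk": 5, "effectiveness": 9},
    "Option B": {"cost": 9, "risk": 6, "effectiveness": 5},
}

def weighted_score(scores, weights):
    """Sum of criterion scores, each multiplied by its weight."""
    return sum(weights[c] * scores[c] for c in weights)

totals = {name: weighted_score(s, criteria_weights) for name, s in options.items()}
best = max(totals, key=totals.get)
```

The value of the exercise lies less in the final number than in forcing participants to agree on the criteria and weights before arguing about the options.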
Chapter 12, entitled Practitioner's Guide to Collaboration, discusses, among other things, how to include in the analytic process the rapidly growing social networks of area and functional specialists who are geographically distributed throughout the Intelligence Community. It proposes that most analysis be done in two phases: a divergent analysis or creative phase with broad participation by a social network using a wiki, followed by a convergent analysis phase and final report done by a small analytic team.
How can we know that the use of structured analytic techniques does, in fact, provide the claimed benefits for intelligence analysis? As we discuss in chapter 13, Evaluation of Structured Analytic Techniques, there are two approaches to answering this question: logical reasoning and empirical research.
The logical reasoning approach starts with the large body of psychological research on the limitations of human memory and perception and pitfalls in human thought processes. If a structured analytic technique is specifically intended to mitigate or avoid one of the proven problems in human thought processes, and that technique appears to be successful in doing so, that technique can be said to have face validity. The growing popularity of several of these techniques would argue that they are perceived by analysts, and their customers, as providing distinct added value in a number of different ways.
Although these are strong arguments, the Intelligence Community has not yet done the research or surveys needed to document them. Another approach to evaluation of these techniques is empirical testing.
This is often done by constructing experiments that compare analyses in which a specific technique is used with comparable analyses in which the technique is not used. Our research found that such testing done outside the Intelligence Community is generally of limited value, as the experimental conditions varied significantly from the conditions under which the same techniques are used in the Intelligence Community.
Chapter 13 proposes a broader approach to the validation of structured analytic techniques. It calls for structured interviews, observation, and surveys in addition to experiments conducted under conditions that closely simulate how these techniques are used in the Intelligence Community.
Chapter 13 also recommends formation of a separate organizational unit to conduct such research as well as other tasks to support the use of structured analytic techniques throughout the Intelligence Community. The final chapter, Vision of the Future, begins by using a new structured technique, Complexity Manager, to analyze the prospects for future growth in the use of structured analytic techniques over the next five years. It identifies and discusses the interactions between ten variables that will either support or hinder the growing use of structured techniques during this time frame.
Without structured techniques, it is virtually impossible to understand the meaning of the accumulated data. Many variables are impossible to operationalize because they cannot be adequately quantified or fully collected. Because in many cases the variables are so complex, countless, and incomplete, proponents of intuition argue that qualitative intelligence analysis is an art: an intuitive process based on instinct, education, and experience.

In contrast to structured methods are those techniques that are not structured, frequently identified as "intuition." Intuition is a feeling or instinct that does not use demonstrative reasoning processes and cannot be adequately explained by the analyst with the available evidence. Intuition comes with experience, and it is difficult, if not impossible, to teach. Advocates of structured methods counter that these techniques have unique value over intuition in that they can be easily taught to other analysts as a way to structure and balance their analysis. They argue that, although it is impossible to consider every variable when conducting analysis, and although much may be unknown, identifying what is known and analyzing it scientifically is an effective approach.

This framing sometimes equates intuition with lack of structure, which is not necessarily the case. Folker has since modified his distinction between structured methods and intuition, arguing that the real distinction lies between the visible or verifiable aspects of the process and the invisible or non-verifiable aspects, rather than how structured versus unstructured the analytic approach might be. If qualitative intelligence analysis is not exclusively an art nor a science, then it may best be considered a combination of both intuitive and scientific approaches.
According to Folker, "a long-standing debate exists within the Intelligence Community about whether more should be invested in structured methodologies to improve qualitative intelligence analysis": whether analysis is an art depending largely on subjective, intuitive judgment or a science depending largely on structured, systematic analytic methods.

METHODS

Some outside observers with little experience in the Intelligence Community assume that structured methods dominate the analytic process. This perspective is best captured by New York Times columnist David Brooks in an article that criticized CIA's analysis as being too structured. The problem with this assumption, though, is that it is inaccurate. Even though there are a great many analytic methods that intelligence analysts could choose from,15 the intelligence analysis process frequently involves intuition rather than structured methods.

As someone who worked at CIA, I possess firsthand knowledge of the kind of analytic approaches used at the time. For example, most counterterrorism and counternarcotics analysts probably either used or knew about using link charts or social network analysis as a way to display or represent connections between targets of interest, and perhaps other analytic techniques were used in other parts of the agency. But the analysts I worked with didn't use them at all. While I was there, analysts rarely made explicit the process they used in coming to a judgment. No one I knew, except for maybe the economic analysts, used any form of structured analytic process that was transparent to others. No quantitative methods; no special software; no analysis of competing hypotheses; not even link charts. In other words, intuition frequently dominated the actual process of intelligence analysis instead of more structured methodologies.

Finding out who used analytic methodologies was a particular interest of mine, and I ran into, and identified with, methodologists who advocated greater rigor in analysis or who experimented with innovative techniques. Foremost among these were Jack Davis and alternative analysis;19 Stanley Feder and FACTIONS and Policon and other kinds of modeling;20 the folks at the Strategic Assessments Group who worked on scenario-based simulations;21 and Carole Dumaine and the Global Futures Project, which at the time tended to support future scenario work, among other things.
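Link charts of the kind mentioned above can be represented as a simple adjacency structure. The names and links below are invented for illustration; the sketch shows only the core idea: record who is connected to whom, then look for heavily connected entities.

```python
from collections import defaultdict

# Hypothetical observed links between entities of interest.
links = [("A", "B"), ("B", "C"), ("B", "D"), ("D", "E")]

# Build an undirected graph: each link connects both endpoints.
graph = defaultdict(set)
for a, b in links:
    graph[a].add(b)
    graph[b].add(a)

# Degree centrality: entities with many links often merit a closer look.
degree = {node: len(neighbors) for node, neighbors in graph.items()}
hub = max(degree, key=degree.get)
```

Real social network analysis tools add layout, weighting, and richer centrality measures, but the underlying data structure is no more than this.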
But for a variety of different reasons, including some good ones, few analysts embraced these techniques. Others have also made similar observations. Folker has said that "structured methodologies are severely neglected" in the Intelligence Community, and observes that these approaches were "rejected by analysts because the scientific methods were thought to be too" cumbersome a way to scientifically analyze a problem. As he points out, "under the accelerating pressures of time, intelligence analysts feel that structured analytical approaches are too cumbersome."

Pherson went on to say that "unfortunately, at CIA most of our energy and on-the-job training as analysts traditionally has gone into phase three: learning how to capture the essence of our analysis. Substantial resources also have been devoted to phase one, but we remain woefully behind what technology offers. And, until recently, we have largely ignored the need, or used the excuse 'we don't have the time,' to develop the necessary skills" to ensure more rigorous analysis. Another observer is Arthur Hulnick, a professor at Boston University who spent 35 years as an intelligence analyst and manager, who has said that "methodologists discovered that their new techniques were unpopular with analysts because they had no time to absorb these systems in the face of tight deadlines." In terms of accountability, Alan Schwartz points out that it is natural for analysts to prefer "to have their intuitive process unexamined."

American Intelligence Journal Page 9 Summer
Other analysts failed to adopt the techniques because their benefits remained unproven to a skeptical audience. As Folker puts it, intelligence analysts "are not convinced that structured methodologies will improve their analysis." Folker also points out that "structured thinking is radically at variance with the way in which the human mind ordinarily works. Most people are used to solving problems intuitively by trial and error. Breaking this habit and establishing a new habit of thinking is an extremely difficult task," and probably a primary reason for resistance. This skepticism was heightened by the fact that the value of many new techniques or methods was asserted or argued rather than proven through scientific evaluation.

But perhaps organizational culture can change if the use of structured methods can be demonstrated to provide clear benefits. And there, Richards Heuer has provided us with a little hope. In an article he wrote regarding the use of quantitative techniques in intelligence analysis, Heuer observed that the initial attitude of country analysts toward his unconventional proposals was typically skeptical, but that "equally typical, however, has been their post-project appraisal that the work was interesting and well worth doing." For example, Stanley Feder, a former CIA methodologist, argues that the use of a specific analytic model produced more precise forecasts than conventional intelligence analysis without sacrificing accuracy. So perhaps it is possible for structured analytic techniques to improve analysis, although it has been almost 30 years since Heuer made that observation, and intuition remains the approach used by most analysts.
Another example is Folker's Master's thesis at the Joint Military Intelligence College, in which he showed that using one particular analytic method, structured hypothesis testing, produced better analysis than intuition. In the end, once there are impediments to incorporating structured methods into how people do analysis, those impediments become embedded in the organizational culture. Feder said that "despite the advantages of the models the vast majority of analysts do not use them"; that kind of systematic analysis does not fit into an organizational culture that sees an "analyst" as someone who forms judgments from available information, in contrast to people who use models.

But the same kinds of debates exist in other fields and can be used to shed light on ways to resolve the current debate. Just as there is both an art and a science to intelligence analysis, there is also an art and a science to medicine, as well as a similar discussion there about the value of structured methods versus intuition. The comparison between intelligence analysis and medicine has a long history. In 1949, Sherman Kent wrote that "intelligence is a simple and self-evident thing. As an activity, it is the pursuit of a certain kind of knowledge... When a doctor diagnoses an ailment, when almost anyone decides upon a course of action, he usually does some preliminary intelligence work."
Historian Walter Laqueur examined the analogy at length, and argued that medicine is more an art than a science because the process of diagnosis entails the use of judgment as a means to address ambiguous signs and symptoms. The doctor and the analyst have to collect and evaluate evidence about phenomena frequently not amenable to direct observation; this is done on the basis of indications, signs, and symptoms. Another observer notes that "medicine is an art and a tradecraft," and goes on to describe how medicine relies on a base of science that has not yet been created to support intelligence analysis.

In medicine there is debate about the value of structured methodologies, just as there is in intelligence, and some people have criticized the unstructured aspects of medical diagnosis, part of the "art" of medicine. Sometimes doctors who work in large practices are limited to only a few minutes with each patient. This severely restricts how much time they can spend during the diagnostic process, which makes them resistant to using more structured methods, and those same pressures push the doctor to make a diagnosis and settle on a treatment quickly. For example, according to Jeffrey Pfeffer and Robert Sutton in an article in the Harvard Business Review, medical decisions are frequently made without regard to the best available evidence.
Jerome Groopman, a doctor at Harvard Medical School and author of the book "How Doctors Think," has said that "conservatively about 15 percent of all people are misdiagnosed" because of "errors in thinking," primarily involving shortcuts: snap judgments that turn out to be wrong. "And too often, we make what's called an anchoring mistake: we fix on that snap judgment," which he says could be based on anything. He also describes other kinds of cognitive errors that physicians make in the diagnostic process, and he points out that there is a movement afoot to address them.

Some have argued that medical diagnosis can be turned from an art into a science if physicians use structured methods to assist them in the process of making diagnoses. In medicine, the structured approach to diagnosis is known as differential diagnosis. For intelligence practitioners unfamiliar with this process, differential diagnosis is the medical equivalent of Analysis of Competing Hypotheses. Richards Heuer has said that his discussion of the "diagnosticity of evidence" in his book "Psychology of Intelligence Analysis" was drawn from the medical literature. Good illustrations of differential diagnosis in practice can be found in the Fox TV show "House," which is about a doctor who specializes in diagnosing difficult cases.
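The parallel between differential diagnosis and Analysis of Competing Hypotheses can be made concrete with a toy ACH matrix. The hypotheses and evidence items below are invented; the sketch shows only the core mechanic: score each piece of evidence against each hypothesis, then focus on refutation, since the hypothesis with the fewest inconsistencies survives best.

```python
# Toy ACH matrix. Each evidence item is scored against each hypothesis:
# "C" consistent, "I" inconsistent, "N" neutral. All entries hypothetical.
matrix = {
    "E1": {"H1": "C", "H2": "I"},
    "E2": {"H1": "N", "H2": "I"},
    "E3": {"H1": "I", "H2": "C"},
}

def inconsistency_count(matrix, hypothesis):
    """ACH weighs disconfirming evidence: count the 'I' cells."""
    return sum(1 for scores in matrix.values() if scores[hypothesis] == "I")

counts = {h: inconsistency_count(matrix, h) for h in ["H1", "H2"]}
least_refuted = min(counts, key=counts.get)
```

As in differential diagnosis, the discipline comes from trying to rule hypotheses out rather than confirming a favorite.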
Frequently, however, physicians don't work through this process in an observable way; rather, they do so intuitively. In many cases, the intuitive approach is used in medicine because of time pressure. A more structured alternative is the clinical decision tree. The trunk of the clinical decision tree is a patient's major symptom or laboratory result, contained within a box. For example, a common symptom like "sore throat" would begin the algorithm, followed by a series of branches with "yes" or "no" questions about associated symptoms. Is there a fever? Are swollen lymph nodes associated with the sore throat? Have other family members suffered from this symptom? Similarly, a laboratory test like a throat culture for bacteria would appear farther down the trunk of the tree. Ultimately, following the branches to the end should lead to the correct diagnosis and therapy.

There is a goal, but two experts may go about acquiring information, forming relationships, and drawing conclusions quite differently; that is, they may use different structured approaches to reach the goal, even though there may be other perfectly valid ways to succeed. Doctrine must really be used as a guide: while it is not the only answer, it can still act as a mechanism to inculcate best practices into new practitioners.
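The sore-throat algorithm described above can be sketched as a small yes/no decision tree. The branches and endpoints below are invented for illustration and are not medical advice; the point is the structure, in which each answer selects the next branch until a leaf recommendation is reached.

```python
# Minimal clinical-decision-tree sketch. All branches are hypothetical.
tree = {
    "question": "Fever present?",
    "yes": {
        "question": "Swollen lymph nodes?",
        "yes": "Order throat culture",          # possible bacterial infection
        "no": "Likely viral; supportive care",
    },
    "no": "Monitor symptoms",
}

def walk(node, answers):
    """Follow yes/no answers down the tree until a leaf (a recommendation)."""
    while isinstance(node, dict):
        node = node["yes"] if answers[node["question"]] else node["no"]
    return node

result = walk(tree, {"Fever present?": True, "Swollen lymph nodes?": True})
```

Written this way, the diagnostic path is fully observable, which is precisely what the intuitive approach lacks.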
Groopman thinks that this more structured approach to diagnosis may be effective for simple problems, but that it "quickly falls apart when doctors need to think outside the box, when symptoms are vague, or multiple and confusing, or when test results are inexact." So how might this discussion about the art and science of medicine assist intelligence practitioners in improving practices in their own field? Just as medicine is debating how to adjust its diagnostic doctrine to become less like an art and more like a science, the same discussion is taking place regarding intelligence analysis as well.
This discussion about the use of structured methods in medicine is taking place within an even larger context of evidence-based medicine, which is a push to make medical practice more scientific. For example, an evidence-based approach to medicine would involve the implementation of evaluation and feedback mechanisms. A couple of years ago, the New York Times published an article describing how some radiologists had instituted a feedback loop in their diagnosis of breast cancer, which provided them with the opportunity to track cases over time, find out where their mistakes were, and use that knowledge to become better at diagnosing problems. If these processes are found to be more effective, an argument could be made to incorporate them into existing diagnostic doctrine.

Still an open question, however, is whether there is an equivalent kind of doctrine for intelligence analysis. Washington Platt observed that "in intelligence production as yet we find little study of methods as such," yet study of methods by an open-minded expert in that field nearly always leads to worthwhile improvement. It was in the 1990s that CIA's Product Evaluation Staff put together its analytic tradecraft notes in an effort to formalize analytic doctrine, and in 1999 its Center for the Study of Intelligence released Richards Heuer's "Psychology of Intelligence Analysis," which provides the articulation, rationale, and description of the technique known as Analysis of Competing Hypotheses. But is there a mechanism to inculcate this doctrine into new practitioners?
But medicine also possesses educational requirements for entry into the profession. In medicine, a doctor spends a substantial amount of time in school to learn some of the knowledge that has been accumulated thus far on medical science; in other words, how to apply that science in practice. In an optimal world, doctrine might serve the same function for analysis. But doctrine alone may not provide the answer to improving analytic quality across the board, because there might not be a single best way to approach each specific problem. According to Hubal, "experts often don't agree." A repository of best structured practices, as well as meta-data describing "when, why, how to apply those practices,"48 would enable intelligence analysis as an occupation to learn and improve.

Intelligence analysis has, for the most part, been practiced more as a craft than a profession; or as Hubal has said, "the educational and experiential processes have not yet been formalized." A craftsman's skills are developed through training and experience, while a professional is someone who has been formally educated in the "science" of his or her field, and then uses that knowledge in an applied way. Crafts rely primarily on the skill of the individual practitioner, which does not improve very much from generation to generation. That is why professionalization is necessary. Professionalization, the process of shifting a profession from less formal to more formal practices, relates to the interaction of the art and the science in the field. According to Randy Pherson, there is "a growing realization that we have to instill more rigor into the analytic process and that we can no longer afford to say we are too busy to get it right." Various proposals have been made for doing so; two of the more prominent are Rob Johnston's call for the incorporation of methodologists into teams of substantive experts,50 and Tim Smith's suggestion that to increase the rigor of analysis the community should create "knowledge factories" explicitly using scientific methodology to produce all-source finished intelligence. There is a growing consensus that some kind of improved professionalization process is necessary, but an open question is the mechanism for doing so.
Some of the pieces of this process are being put together at the national level by the staff of the Office of the Director of National Intelligence (ODNI), as they try to define analytic competencies and standards for intelligence analysts across the many analytic disciplines of the intelligence community. But as analytic standards and doctrine are developed, questions will be raised about whether to promote certain analytic techniques over others: whether to teach, or mandate the use of, certain techniques such as ACH or link analysis. For example, parts of the intelligence community have been encouraging analysts to use more structured methods in their analysis, including "training of all analysis managers in the core techniques,"51 and additional courses are being introduced at the agency level. In addition, Pherson's company is teaching structured analytic techniques, and the software version of Richards Heuer's Analysis of Competing Hypotheses58 is being adapted by CIA to accommodate multiple users.

Others, such as myself, are advocates for a less hierarchical approach through the efforts of a professional association. As my co-author, Dr. Jonathan Clemente, and I have argued, the most effective way to professionalize intelligence analysis is through an over-arching professional association modeled on the AMA. Once this kind of infrastructure is created, it will be possible to gain knowledge, scientifically, about the effectiveness of different approaches,62 so that judgments rest not on intuitive plausibility but on solid evidence of what works.