
The Sherman Kent Center for Intelligence Analysis

Occasional Papers: Volume 1, Number 1, September 2002

Improving CIA Analytic Performance:
Strategic Warning
Jack Davis,
Sherman Kent Center


A host of reports have been written over the 50 years of CIA history evaluating analytic performance and recommending changes in priorities and tradecraft. These “post-mortem reports” have been issued by Agency leaders and components as well as by Congressional committees and commissions and by non-governmental organizations concerned about intelligence performance. Starting in the 1990s, post-mortem reports increased in number, generated both by charges of specific intelligence failures and by general recognition that the post-Cold War period presented new challenges to intelligence.

The recent post-mortem reports have helped Directorate of Intelligence leaders to examine current doctrine and practice critically, and to address identified challenges in training programs. This Occasional Paper is one of a series of assessments of what recent critiques have said about the key challenges facing the DI in the new century.

The present paper addresses the challenges of strategic warning. It reviews five post-mortem critiques: (1) Douglas J. MacEachin, “Tradecraft of Analysis,” U.S. Intelligence at the Crossroads: Agendas for Reform (1995); (2) Adm. David Jeremiah (Ret.), Intelligence Community’s Performance on the Indian Nuclear Tests (1998); (3) CIA, Office of Inspector General, Alternative Analysis in the Directorate of Intelligence (1999); (4) Report of the Commission to Assess the Ballistic Missile Threat to the United States (1998); (5) Working Group on Intelligence Reform of the National Strategy Information Center, The Future of US Intelligence (1996).

 

Substantive Uncertainty and Strategic Warning

The central task of intelligence analysis is to help US officials—policymakers, warfighters, negotiators, law enforcers—deal more effectively with substantive uncertainty, and especially to provide timely warning of military attacks and other threats to US national security interests. Tactical (incident) warning is a major DI responsibility, focusing on hot-button issues such as terrorism, WMD developments, and political instability. Identifying when, where, and how a declared or potential adversary will strike the United States directly, mount a challenge to US interests abroad, or make a weapons breakthrough is the highest priority of the DI’s current intelligence effort.

Recent post-mortem studies have focused, however, on strategic warning, the subject of this memorandum. Strategic warning can be defined as timely analytic perception and effective communication to policy officials of important changes in the level or character of threats to national security interests that require re-evaluation of US readiness to deter or limit damage. The goal is to prevent strategic surprise. The issues addressed here are changes in the level of likelihood that an enemy will strike or that a development harmful to US interests will take place, and changes in the adversary’s mechanisms for inflicting damage.

Illustrative threats on which intelligence can help policy officials determine an appropriate level of general preparedness include (1) attacks against the United States and its interests abroad by states and non-state actors via military, terrorist, and other means, (2) collapse of stability from domestic dynamics in a country important to US security, (3) major changes in an adversary’s strategy and practice affecting WMD proliferation or international terrorism.

Strategic warning is an unrelenting, often painful, challenge to both intelligence analysts and policymakers. Major surprises over the decades—that is, failures to warn effectively—include Pearl Harbor (1941), the Communist attacks on South Korea (1950), the Soviet invasion of Czechoslovakia (1968), the Iranian revolution (1979), and Iraq’s invasion of Kuwait (1990).

Key to the warning challenge is that the substantive uncertainty surrounding threats to US interests requires analysts, and policymakers, to make judgments that are inherently vulnerable to error. Analysts must issue a strategic warning far enough in advance of the feared event for US officials to have an opportunity to take protective action, yet with the credibility to motivate them to do so. No mean feat. Waiting for evidence that the enemy is at the gate usually fails the timeliness test; prediction of potential crises without hard evidence can fail the credibility test. When analysts are too cautious in estimative judgments on threats, they risk blame for failure to warn. When too aggressive in issuing warnings, they invite criticism for “crying wolf.”

Analysts face two special challenges regarding strategic warning: overcoming their own mindset and that of policy officials. Especially with issues of major policy interest on which analysts have reached an agreed estimate judgment and reported it often, they find it difficult to take the measure of disconfirming information and to explore alternative plausible meanings of gaps in diagnostic information caused by adversarial Denial and Deception (D&D) operations. Especially on issues on which US leaders have not yet focused and analysts have not reached a confident consensus, it is difficult to overcome decisionmaker aversion to undertaking costly, unpopular, and otherwise inconvenient countermeasures.

Policymakers face their own thankless challenges regarding warning. US military and other national security resources are limited, including the time and attention of national leaders, who usually must deal with domestic political and policy issues as well as foreign challenges. The opportunity costs can be high if these resources are inappropriately allocated to ward off one threat that does not materialize to the neglect of another threat that does.

A thoughtful senior policy official has opined that most potentially devastating threats to US interests start out being evaluated as unlikely. The key to effective intelligence-policy relations in strategic warning is for analysts to help policy officials in determining which seemingly unlikely threats are worthy of serious consideration.

For better or worse, neither the DI nor its critics keeps a scorecard of strategic warnings that have been successfully executed. As indicated below, real and perceived failures to warn have brought forth critical internal as well as external examinations of analytic performance. It is mostly from study of failure, then, that DI analysts can learn lessons about the challenges of uncertainty, surprise, and warning.

 

Analytic Tradecraft for Managing Substantive Uncertainty

The failure to provide strategic warning during the months prior to Iraq’s 1990 invasion of Kuwait generated recommendations for revamping warning analysis by DDI Doug MacEachin (1993-1995) that spurred changes in the DI’s analytic approach to substantive uncertainty generally.

The DDI observed that the bottom-line judgment that Iraq was unlikely to initiate warfare in the near term, issued repeatedly in the year before the assault on Kuwait, was based on the assumption that Iraq needed several years to recover from the military and economic devastation of its long war with Iran. That assumption was so widely held by analysts that it was rarely examined critically. Nor was the heavy dependence of the no-war conclusion on the recovery-first assumption explicitly recognized.

The DDI criticized the prevailing approach to substantive uncertainty as a “predictions sweepstake” that emphasized competition among analysts to control bottom-line judgments rather than a structured appraisal of the soundness of the analytic case for alternative plausible dynamics and outcomes. In contrast, the more rigorous tradecraft for dealing with substantive uncertainty he recommended—sometimes called “Linchpin Analysis”— requires careful attention to selecting the factors at play deemed most likely to drive and determine the outcome of a situation on which there is too little hard information to rely on a flat prediction.

For example, analysts self-consciously assess alternative views on which players, forces, and relationships will likely determine whether country X will attack country Y. These key factors or linchpins are explicitly conveyed in the assessment as the basis for estimative conclusions. In a strategic warning regime, attention is then paid to identifying triggers (plausible developments that could uncouple the linchpins holding the argument together), and signposts (early indicators that the bottom-line judgment needs revision).

The DDI conveys the essential character of what he labeled “forecasting” (the Linchpin process), which he differentiates from “fortune-telling” (intelligence judgments focused on asserting a bottom line), in an essay on “Tradecraft of Analysis,” published in US Intelligence at the Crossroads: Agendas for Reform (1995, Roy Godson et al., editors).

Analyses of potential developments are based on assessments of factors that together would logically bring about a certain future. These factors are the “drivers” or “linchpins” of the analysis. If one or more of them should change, or be removed, or turn out to have been wrong to start with, the basis for the forecast would no longer hold.

  • Identifying the role of these factors in the analytic calculus is a fundamental requirement of sound intelligence forecasts. The policymaker needs to know the potential impact of changes in these “linchpins.”
  • The consumer especially needs to know if for any of these “linchpins” the evidence is particularly thin, there is high uncertainty, or there is no empirical evidence, but only assumptions based on past practice or what appear to be logical extensions of what is known.

Careful attention to the selection and testing of key assumptions to deal with substantive uncertainty is now well established as the doctrinal standard for DI analytic tradecraft, and is a key part of instruction in the Kent School’s Career Analyst Program (CAP) curriculum for new analysts.
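
To make the linchpin-trigger-signpost structure described above concrete, the following Python sketch shows one hypothetical way an assessment’s key assumptions and early indicators might be recorded so that the dependence of the bottom-line judgment on them stays explicit and reviewable. It is illustrative only; the class names, fields, and example content (drawn loosely from the Iraq 1990 case discussed earlier, with the signpost and trigger entries invented) do not represent any actual DI tool or data model.

    from dataclasses import dataclass, field

    @dataclass
    class Linchpin:
        """A key assumption ("driver") on which the bottom-line judgment rests."""
        statement: str
        evidence_quality: str  # e.g., "solid", "thin", or "assumption only"

    @dataclass
    class Assessment:
        bottom_line: str                                # estimative judgment conveyed to policymakers
        linchpins: list = field(default_factory=list)
        signposts: list = field(default_factory=list)   # early indicators the judgment needs revision
        triggers: list = field(default_factory=list)    # developments that could uncouple a linchpin

        def review(self, observed):
            """Flag the judgment for re-examination if any signpost or trigger has been observed."""
            hits = [item for item in self.signposts + self.triggers if item in observed]
            if hits:
                return "Re-examine linchpins; observed: " + "; ".join(hits)
            return "Bottom line stands, pending new reporting"

    # Hypothetical illustration built around the recovery-first assumption discussed above;
    # the specific signpost and trigger strings are invented for illustration.
    iraq_1990 = Assessment(
        bottom_line="Iraq unlikely to initiate warfare in the near term",
        linchpins=[Linchpin("Iraq needs several years of postwar recovery", "assumption only")],
        signposts=["sustained Iraqi troop deployments toward Kuwait"],
        triggers=["breakdown of Iraq-Kuwait talks over oil prices and debt"],
    )
    print(iraq_1990.review({"sustained Iraqi troop deployments toward Kuwait"}))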

 

Averting Strategic Surprise through Alternative Analysis

Because of competing priorities (for example, production speed vs. analytic rigor), doctrinal innovation does not always determine analyst practice. Later in the decade, two additional critical studies of warning intelligence were triggered by the perceived failure to anticipate that a new government in India would act quickly on its campaign pledge to resume nuclear testing (as it did in May 1998). The Jeremiah Report (Intelligence Community’s Performance on the Indian Nuclear Tests, June 1998) and the Office of Inspector General (OIG) report (Alternative Analysis in the Directorate of Intelligence, May 1999) reiterated criticism of insufficient attention by managers as well as analysts to testing assumptions and taking account of alternative dynamics and outcomes.

Both critiques commented on organizational as well as analytic shortcomings. The Jeremiah report recognized the constraints on strategic warning of collection and analytic resource limitations brought on by post-Cold War “downsizing” of intelligence. And the OIG report pointed to pressures on Agency analysts for speed, conciseness, and judgmental decisiveness as obstacles to employing more deliberate analytic tradecraft for combating substantive uncertainty.

The reports called for greater recourse to the techniques of Alternative Analysis, first for more rigorous testing of prevailing judgments and then to take more deliberate account of seemingly less likely but potentially high-impact developments.

Admiral Jeremiah stressed the need to institutionalize the use of alternative analytic approaches on complex issues when a change of government or other threshold event increases the likelihood of departures from prevailing analytic assumptions about political and military dynamics. One of the main cognitive traps analysts must overcome is mirror-imaging—estimating the risk-benefit calculations of a foreign government or non-state group based on what would make sense in a US or Western European context.

In addition to enhanced training and other internal mechanisms to ensure greater critical thinking by the analysts themselves, Jeremiah recommended two external fixes to ensure that “more rigor…go[es] into analysts’ thinking when major events take place.”

A) Bring in outside substantive experts in a more systematic fashion [so that we work against this “everybody thinks like us” mind set].

B) Bring in experts in the process of analysis when the IC faces a transition on a major intelligence issue. These analytic thinkers would serve, together with substantive specialists, as “Red Teams” on major analytic problems and would work with analysts to study assumptions, mirror-imaging, and complex analytic processes.

The OIG report acknowledged numerous useful DI activities to promote critical thinking, but made a series of recommendations calling for greater management buy-in and analyst training to ensure more frequent and more effective use of Alternative Analysis.

  • Establish guidelines…for when and how alternative analysis techniques and approaches are best applied to an intelligence issue and…fuller representation of sound minority views and outcome uncertainties are to be incorporated in…finished intelligence products.
  • Establish a mechanism for routinely identifying best practice in alternative analysis both within and outside the DI.
  • Articulate a comprehensive plan for improving alternative analysis that clearly links investment priorities to specific goals.
  • Review [and improve] the Directorate’s… analytic methodology support infrastructure.
  • Implement a training curriculum…that provides in-depth exposure to…alternative analysis tools and presentation techniques …focusing first on training for managers.

In response to both critical studies, the DI has substantially increased attention to the wide range of undertakings and tradecraft techniques under the rubric of Alternative Analysis. For example, the Offices have expanded use of outside substantive experts to generate and test analytic assumptions. Analysts have increased their use of techniques such as red teaming (role-playing an adversary’s calculations), Devil’s Advocacy (deliberate challenge of a DI team’s strongly held analytic views), and Team A-Team B analysis (competitive assessments) in order to focus greater attention on High Impact-Low Probability threats to US national security interests.

Regarding formal training, the Kent School runs a monthly Alternative Analysis Workshop and has introduced an Alternative Analysis (AA) unit into the CAP. Through the Global Futures Partnership, the Kent Center sponsors scenario exercises on key issue trends and conferences on organizational and conceptual requirements for anticipating changes affecting US security interests.

 

Taking Greater Account of Denial and Deception

Claiming that analysts have often been years late in detecting menacing WMD developments, the Report of the Commission to Assess the Ballistic Missile Threat to the United States (July 1998) criticized intelligence for insufficient attention to Denial and Deception (D&D) and other obstacles to reliable judgments on national security threats. Known as the Rumsfeld Commission after its chairman (and current Secretary of Defense) Donald Rumsfeld, the Commission insists that analysts take greater account of what they do not know when providing policy officials with the intelligence backup for planning against the threat of rogue-regime missile developments.

From the Commission’s viewpoint, the analysts’ bottom-line judgment on when and how an adversary will be capable of threatening US interests is often too dependent on assessing available hard evidence of what that adversary has achieved, and on the highly structured weapons-development model of the former Soviet Union. In a variant of Alternative Analysis, the report calls upon analysts to search for evidence that would disprove an adversary’s reliance on alternative technologies, non-Soviet methodologies, and new paths toward a menacing potential. But as long as these more alarming paths cannot be ruled out, analysts must assess the implications of a rogue regime, say, buying or stealing missile technology and deploying weapons without elaborate (and thus detectable) testing.

The report elaborates on employing the technique of “alternative hypotheses”:

This technique can help make sense of known events and serve as a way to identify indicators relative to a [missile] program’s motivation, purpose, pace and direction. By hypothesizing alternative scenarios a more adequate set of indicators and collection priorities can be established. As the indicators begin to align with the known facts, the importance of the information gaps is reduced and the likely outcomes projected with greater confidence. The result is the possibility for earlier warning than if analysts wait for proof of a capability in the form of hard evidence of a test or a deployment.

In other words, policymakers need to be warned of what the threat potential is if analysts are wrong in their major assumptions, as well as the level of threat if they are right.
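
As a purely hypothetical illustration of the “alternative hypotheses” technique quoted above, the short Python sketch below posits two scenarios, each with the indicators it would predict, and measures how far incoming reporting aligns with each. The scenario names, indicators, and reporting are invented; the point is only that partial alignment with a higher-threat path can support earlier warning than waiting for hard proof of a test or deployment.

    # Each hypothesized development path maps to the indicators it would predict.
    hypotheses = {
        "Soviet-style program (long, observable test series)": {
            "multiple flight tests", "dedicated test-range construction", "long development timeline"},
        "Purchased or stolen technology, minimal testing": {
            "foreign technology transfers", "few or no flight tests", "rapid facility construction"},
    }

    def alignment(observed):
        """Return the share of each hypothesis's expected indicators seen in reporting so far."""
        return {name: len(expected & observed) / len(expected)
                for name, expected in hypotheses.items()}

    # Invented reporting: as indicators begin to align with known facts, the higher-threat
    # path can be flagged for warning before a test or deployment provides hard evidence.
    observed = {"foreign technology transfers", "rapid facility construction"}
    for name, score in alignment(observed).items():
        print(f"{score:.0%} of expected indicators observed -> {name}")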

This standard for analyst participation in the warning process could be applicable whenever policymakers grapple with costly or politically charged defense issues. Commission member Paul Wolfowitz (now Deputy Secretary of Defense) opined in 1995 that the proper role of intelligence is to serve as a tool for effective debate among competing policymakers (via multiple outcome analysis)—and not as a weapon that one group of policymakers can wield against the others (single-outcome analysis).

Kent School workshops on AA and D&D, and similar units in the CAP, work to increase understanding of the general tradecraft challenges raised by the Rumsfeld report. In addition, a Kent Center Fellow is currently developing a methodology that addresses specifically the challenges of tracking rogue-regime ballistic missile developments.

Strategic Warning: Role of the Analyst

The Rumsfeld Commission’s call for rethinking the analyst’s role in the warning process echoes a 1996 critique of intelligence performance issued by the Working Group on Intelligence Reform of the National Strategy Information Center. The Future of US Intelligence defines the essential output of the warning process as governmental assessments of the character of threats, undertaken to establish an appropriate level of national security readiness. The Working Group sets the same standard for warning analysis that it does for intelligence analysis generally—not to attempt to predict the future but to provide analysis that helps policy officials shape the future. Regarding the warning process:

The measure of effectiveness is not “were we surprised” but “were we at an appropriate level of readiness.” In case of a “surprise attack” it is better to be subjectively surprised [by a specific incident] but at a high level of readiness [regarding a general threat], than to be effectively unready, even though expecting the attack.

The role of analysts in this strategic warning regime is to leverage their expertise on foreign developments, first to help government officials determine appropriate levels of preparedness for identified national security threats and second to provide actionable assessments to ward off or minimize the dangers. Once analysts perceive that policymakers have taken their warning on board, this second obligation includes helping policymakers identify and critically examine various measures to deter and limit damage.

This call for changing the analysts’ role in the warning process was mostly overlooked in the heavy flow of recommendations for improving intelligence performance issued during the 1990s. Redefinition of the analysts’ role, perhaps radical change, will likely get deliberate attention in the Congressional and other post-mortem assessments generated in response to the “surprise” terrorist attack of 11 September 2001.

 

Summary Recommendations

No matter what the future role of intelligence in the strategic warning process, the challenge to DI analysts of effective battle against substantive uncertainty will remain unrelenting—at times punishing. Three summary recommendations are worthy of consideration.

  1. Analysts, including new analysts, must balance their professional commitment to increased mastery of what can be known of their accounts (substantive expertise) with their commitment to enhanced skills for dealing with what cannot be known (tradecraft expertise).
    • Perhaps the most painful lesson of 11 September 2001 is that, at least regarding individual incidents, surprise attacks are inevitable. Analysts will often have to decide whether and how to provide strategic warning convincingly despite the absence of a “smoking gun” report.
  2. While there is no magic bullet for averting strategic surprise, tradecraft skills for undertaking Alternative Analysis and for countering D&D can improve the chances of success, and thus are everyday professional responsibilities for all DI analysts, not just for methodologists and specialists.
    • Analysts must master the skills for effective challenge of their own assumptions and tough-minded evaluation of the authenticity and general adequacy of classified as well as open-source information—before, not after, taking on difficult substantive assignments.
  3. The more analysts know about the US policymaking process and the more they understand the challenges facing their policymaking counterparts, the better positioned intelligence will be for any assigned role in strategic warning.
    • Absent a windfall of smoking-gun information, for analysts to warn effectively they must understand how their key clients set their issue priorities, debate and otherwise process decisions with their policy peers, absorb experts’ views and “bad news,” and prefer to deal with substantive uncertainty.

 


Disclaimer:

All statements of fact, opinion, or analysis expressed in Occasional Papers are those of the authors. They do not necessarily reflect official positions or views of the Kent School, the Central Intelligence Agency, or any other US Government entity, past or present.

Nothing in the contents should be construed as asserting or implying US Government endorsement of an article's factual statements and interpretations.

These papers have been prepared with the support of Central Intelligence Agency funds and are published with the consent of the authors.

