Psychology of Intelligence Analysis
Improving Intelligence Analysis
at CIA:
Dick Heuer's Contribution
to Intelligence Analysis
by Jack Davis1
I applaud CIA's Center for the Study of Intelligence for making the work of Richards J. Heuer, Jr. on the psychology of intelligence analysis available to a new generation of intelligence practitioners and scholars.
Dick Heuer's ideas on how to improve analysis focus on helping analysts compensate for the human mind's limitations in dealing with complex problems that typically involve ambiguous information, multiple players, and fluid circumstances. Such multi-faceted estimative challenges have proliferated in the turbulent post-Cold War world.
Heuer's message to analysts can be encapsulated by quoting two sentences from Chapter 4 of this book:
Intelligence analysts should be self-conscious about their reasoning processes. They should think about how they make judgments and reach conclusions, not just about the judgments and conclusions themselves.
Heuer's ideas are applicable to any analytical endeavor. In this
Introduction, I have concentrated on his impact--and that of other
pioneer thinkers in the intelligence analysis field--at CIA, because
that is the institution that Heuer and his predecessors, and I myself,
know best, having spent the bulk of our intelligence careers there.
Leading Contributors to Quality of Analysis
Intelligence analysts, in seeking to make sound judgments, are always
under challenge from the complexities of the issues they address and
from the demands made on them for timeliness and volume of production.
Four Agency individuals over the decades stand out for having made
major contributions on how to deal with these challenges to the quality
of analysis.
My short list of the people who have had the greatest
positive impact on CIA analysis consists of Sherman Kent, Robert Gates,
Douglas MacEachin, and Richards Heuer. My selection methodology was
simple. I asked myself: Whose insights have influenced me the most
during my four decades of practicing, teaching, and writing about
analysis?
Sherman Kent
Sherman Kent's pathbreaking
contributions to analysis cannot be done justice in a couple of
paragraphs, and I refer readers to fuller treatments elsewhere.2 Here I address his general legacy to the analytical profession.
Kent, a professor of European history at Yale, worked in the Research
and Analysis branch of the Office of Strategic Services during World
War II. He wrote an influential book, Strategic Intelligence for American World Policy,
while at the National War College in the late 1940s. He served as Vice
Chairman and then as Chairman of the DCI's Board of National Estimates
from 1950 to 1967.
Kent's greatest contribution to the quality of analysis
was to define an honorable place for the analyst--the thoughtful
individual "applying the instruments of reason and the scientific
method"--in an intelligence world then as now dominated by collectors
and operators. In a second (1965) edition of Strategic Intelligence,
Kent took account of the coming computer age as well as human and
technical collectors in proclaiming the centrality of the analyst:
Whatever the complexities of the
puzzles we strive to solve and whatever the sophisticated techniques we
may use to collect the pieces and store them, there can never be a time
when the thoughtful man can be supplanted as the intelligence device
supreme.
More specifically, Kent advocated
application of the techniques of "scientific" study of the past to
analysis of complex ongoing situations and estimates of likely future
events. Just as rigorous "impartial" analysis could cut through the
gaps and ambiguities of information on events long past and point to
the most probable explanation, he contended, the powers of the critical
mind could turn to events that had not yet transpired to determine the
most probable developments.3
To this end, Kent developed the concept of the analytic pyramid,
featuring a wide base of factual information and sides composed of
sound assumptions, which pointed to the most likely future scenario at
the apex.4
In his proselytizing and in practice, Kent battled against bureaucratic
and ideological biases, which he recognized as impediments to sound
analysis, and against imprecise estimative terms that he saw as
obstacles to conveying clear messages to readers. Although he was aware
of what is now called cognitive bias, his writings urge analysts to
"make the call" without much discussion of how limitations of the human
mind were to be overcome.
Not many Agency analysts read Kent nowadays. But he had
a profound impact on earlier generations of analysts and managers, and
his work continues to exert an indirect influence among practitioners
of the analytic profession.
Robert Gates
Bob Gates served as Deputy
Director of Central Intelligence (1986-1989) and as DCI (1991-1993).
But his greatest impact on the quality of CIA analysis came during his
1982-1986 stint as Deputy Director for Intelligence (DDI).
Initially schooled as a political scientist, Gates earned a
Ph.D. in Soviet studies at Georgetown while working as an analyst at
CIA. As a member of the National Security Council staff during the
1970s, he gained invaluable insight into how policymakers use
intelligence analysis. Highly intelligent, exceptionally hard-working,
and skilled in the bureaucratic arts, Gates was appointed DDI by DCI
William Casey in good part because he was one of the few insiders Casey
found who shared the DCI's views on what Casey saw as glaring
deficiencies of Agency analysts.5
Few analysts and managers who heard it have forgotten Gates' blistering
criticism of analytic performance in his 1982 "inaugural" speech as
DDI.
Most of the public commentary on Gates and Agency
analysis concerned charges of politicization levied against him, and
his defense against such charges, during Senate hearings for his 1991
confirmation as DCI. The heat of this debate was slow to dissipate
among CIA analysts, as reflected in the pages of Studies in Intelligence, the Agency journal founded by Sherman Kent in the 1950s.6
I know of no written retrospective on Gates' contribution to Agency
analysis. My insights into his ideas about analysis came mostly through
an arms-length collaboration in setting up and running an Agency
training course entitled "Seminar on Intelligence Successes and
Failures."7
During his tenure as DDI, only rarely could you hold a conversation
with analysts or managers without picking up additional viewpoints,
thoughtful and otherwise, on what Gates was doing to change CIA
analysis.
Gates's ideas for overcoming what he saw as insular,
flabby, and incoherent argumentation featured the importance of
distinguishing between what analysts know and what they believe--that
is, to make clear what is "fact" (or reliably reported information) and
what is the analyst's opinion (which had to be persuasively supported
with evidence). Among his other tenets were the need to seek the views
of non-CIA experts, including academic specialists and policy
officials, and to present alternate future scenarios.
Gates's main impact, though, came from practice--from
his direct involvement in implementing his ideas. Using his authority
as DDI, he reviewed critically almost all in-depth assessments and
current intelligence articles prior to publication.
With help from his deputy and two rotating assistants from the ranks of
rising junior managers, Gates raised the standards for DDI review
dramatically--in essence, from "looks good to me" to "show me your
evidence."
As the many drafts Gates rejected were sent back to
managers who had approved them--accompanied by the DDI's comments about
inconsistency, lack of clarity, substantive bias, and poorly supported
judgments--the whole chain of review became much more rigorous.
Analysts and their managers raised their standards to avoid the pain of
DDI rejection. Both career advancement and ego were at stake.
The rapid and sharp increase in attention paid by
analysts and managers to the underpinnings for their substantive
judgments probably was without precedent in the Agency's history. The
longer term benefits of the intensified review process were more
limited, however, because insufficient attention was given to
clarifying tradecraft
practices that would promote analytic soundness. More than one
participant in the process observed that a lack of guidelines for
meeting Gates's standards led to a large amount of "wheel-spinning."
Gates's impact, like Kent's, has to be seen on two
planes. On the one hand, little that Gates wrote on the craft of
analysis is read these days. On the other, even though his
pre-publication review process was discontinued under his successors,
an enduring awareness of his standards still gives many managers and
analysts who experienced his criticism first-hand pause before jumping
to conclusions.
Douglas MacEachin
Doug MacEachin, DDI
from 1993 to 1996, sought to provide an essential ingredient for
ensuring implementation of sound analytic standards: corporate tradecraft
standards for analysts. This new tradecraft was aimed in particular at
ensuring that sufficient attention would be paid to cognitive
challenges in assessing complex issues.
MacEachin set out his views on Agency analytical faults and correctives in The Tradecraft of Analysis: Challenge and Change in the CIA.8 My commentary on his contributions to sound analysis is also informed by a series of exchanges with him in 1994 and 1995.
MacEachin's university major was economics, but he also showed great
interest in philosophy. His Agency career--like Gates'--included an
extended assignment to a policymaking office. He came away from this
experience with new insights on what constitutes "value-added"
intelligence usable by policymakers. Subsequently, as CIA's senior
manager on arms control issues, he dealt regularly with a cadre of
tough-minded policy officials who let him know in blunt terms what
worked as effective policy support and what did not.
By the time MacEachin became DDI in 1993, Gates's
policy of DDI front-office pre-publication review of nearly all DI
analytical studies had been discontinued. MacEachin took a different
approach; he read--mostly on weekends--and reflected on numerous
already-published DI analytical papers. He did not like what he found.
In his words, roughly a third of the papers meant to assist the
policymaking process had no discernible argumentation to bolster the
credibility of intelligence judgments, and another third suffered from
flawed argumentation. This experience, along with pressures on CIA for
better analytic performance in the wake of alleged "intelligence
failures" concerning Iraq's invasion of Kuwait, prompted his decision
to launch a major new effort to raise analytical standards.9
MacEachin advocated an approach to structured argumentation called
"linchpin analysis," to which he contributed muscular terms designed to
overcome many CIA professionals' distaste for academic nomenclature.
The standard academic term "key variables" became drivers. "Hypotheses" concerning drivers became linchpins--assumptions
underlying the argument--and these had to be explicitly spelled out.
MacEachin also urged that greater attention be paid to analytical
processes for alerting policymakers to changes in circumstances that
would increase the likelihood of alternative scenarios.
MacEachin thus worked to put in place systematic and
transparent standards for determining whether analysts had met their
responsibilities for critical thinking. To spread understanding and
application of the standards, he mandated creation of workshops on
linchpin analysis for managers and production of a series of notes on
analytical tradecraft. He also directed that the DI's performance on
tradecraft standards be tracked and that recognition be given to
exemplary assessments. Perhaps most ambitious, he saw to it that
instruction on standards for analysis was incorporated into a new
training course, "Tradecraft 2000." Nearly all DI managers and analysts
attended this course during 1996-97.
As of this writing (early 1999), the long-term staying
power of MacEachin's tradecraft initiatives is not yet clear. But much
of what he advocated has endured so far. Many DI analysts use
variations on his linchpin concept to produce soundly argued forecasts.
In the training realm, "Tradecraft 2000" has been supplanted by a new
course that teaches the same concepts to newer analysts. But examples
of what MacEachin would label as poorly substantiated analysis are
still seen. Clearly, ongoing vigilance is needed to keep such analysis
from finding its way into DI products.
Richards Heuer
Dick Heuer was--and
is--much less well known within the CIA than Kent, Gates, and
MacEachin. He has not received the wide acclaim that Kent enjoyed as
the father of professional analysis, and he has lacked the bureaucratic
powers that Gates and MacEachin could wield as DDIs. But his impact on
the quality of Agency analysis arguably has been at least as important
as theirs.
Heuer received a degree in philosophy in 1950 from Williams
College, where, he notes, he became fascinated with the fundamental
epistemological question, "What is truth and how can we know it?" In
1951, while a graduate student at the University of California's
Berkeley campus, he was recruited as part of the CIA's buildup during
the Korean War. The recruiter was Richard Helms, OSS veteran and rising
player in the Agency's clandestine service. Future DCI Helms, according
to Heuer, was looking for candidates for CIA employment among recent
graduates of Williams College, his own alma mater. Heuer had an added
advantage as a former editor of the college's newspaper, a position
Helms had held some 15 years earlier.10
In 1975, after 24 years in the Directorate of Operations, Heuer moved
to the DI. His earlier academic interest in how we know the truth was
rekindled by two experiences. One was his involvement in the
controversial case of Soviet KGB defector Yuriy Nosenko. The other was
learning new approaches to social science methodology while earning a
Master's degree in international relations at the University of
Southern California's European campus.
At the time he retired in 1979, Heuer headed the
methodology unit in the DI's political analysis office. He originally
prepared most of the chapters in this book as individual articles
between 1978 and 1986; many of them were written for the DI after his
retirement. He has updated the articles and prepared some new material
for inclusion in this book.
Heuer's Central Ideas
Dick Heuer's writings make three fundamental points about the cognitive challenges intelligence analysts face:
- The mind is poorly "wired" to deal effectively with both inherent uncertainty (the natural fog surrounding complex, indeterminate intelligence issues) and induced uncertainty (the man-made fog fabricated by denial and deception operations).
- Even increased awareness of cognitive and other "unmotivated" biases, such as the tendency to see information confirming an already-held judgment more vividly than one sees "disconfirming" information, does little by itself to help analysts deal effectively with uncertainty.
- Tools and techniques that gear the analyst's mind to apply higher levels of critical thinking can substantially improve analysis on complex issues on which information is incomplete, ambiguous, and often deliberately distorted. Key examples of such intellectual devices include techniques for structuring information, challenging assumptions, and exploring alternative interpretations.
The following passage from
Heuer's 1980 article entitled "Perception: Why Can't We See What Is
There to be Seen?" shows that his ideas were similar to or compatible
with MacEachin's concepts of linchpin analysis.
Given the difficulties inherent in the human processing of complex information, a prudent management system should:
- Encourage products that (a) clearly delineate their assumptions and chains of inference and (b) specify the degree and source of the uncertainty involved in the conclusions.
- Emphasize procedures that expose and elaborate alternative points of view--analytic debates, devil's advocates, interdisciplinary brainstorming, competitive analysis, intra-office peer review of production, and elicitation of outside expertise.
Heuer emphasizes both the value and the dangers of mental models, or mind-sets. In the book's opening chapter, entitled "Thinking About Thinking," he notes that:
[Analysts] construct their own version of "reality" on the basis of
information provided by the senses, but this sensory input is mediated
by complex mental processes that determine which information is
attended to, how it is organized, and the meaning attributed to it.
What people perceive, how readily they perceive it, and how they
process this information after receiving it are all strongly influenced
by past experience, education, cultural values, role requirements, and
organizational norms, as well as by the specifics of the information
received.
This process may be visualized as perceiving
the world through a lens or screen that channels and focuses and
thereby may distort the images that are seen. To achieve the clearest
possible image...analysts need more than information...They also need
to understand the lenses through which this information passes. These
lenses are known by many terms--mental models, mind-sets, biases, or
analytic assumptions.
In essence, Heuer sees reliance on mental
models to simplify and interpret reality as an unavoidable conceptual
mechanism for intelligence analysts--often useful, but at times
hazardous. What is required of analysts, in his view, is a commitment
to challenge, refine, and challenge again
their own working mental models, precisely because these steps are
central to sound interpretation of complex and ambiguous issues.
Throughout the book, Heuer is critical of the orthodox
prescription of "more and better information" to remedy unsatisfactory
analytic performance. He urges that greater attention be paid instead
to more intensive exploitation of information already on hand, and that
in so doing, analysts continuously challenge and revise their mental
models.
Heuer sees mirror-imaging as an example of
an unavoidable cognitive trap. No matter how much expertise an analyst
applies to interpreting the value systems of foreign entities, when the
hard evidence runs out the tendency to project the analyst's own
mind-set takes over. In Chapter 4, Heuer observes:
To see the options faced by
foreign leaders as these leaders see them, one must understand their
values and assumptions and even their misperceptions and
misunderstandings. Without such insight, interpreting foreign leaders'
decisions or forecasting future decisions is often nothing more than
partially informed speculation. Too frequently, foreign behavior
appears "irrational" or "not in their own best interest." Such
conclusions often indicate analysts have projected American values and
conceptual frameworks onto the foreign leaders and societies, rather
than understanding the logic of the situation as it appears to them.
Competing Hypotheses
To
offset the risks accompanying analysts' inevitable recourse to
mirror-imaging, Heuer suggests looking upon analysts' calculations
about foreign beliefs and behavior as hypotheses to be challenged.
Alternative hypotheses need to be carefully considered--especially
those that cannot be disproved on the basis of available information.
Heuer's concept of "Analysis of Competing Hypotheses" (ACH) is among
his most important contributions to the development of an intelligence
analysis methodology. At the core of ACH is the notion of competition
among a series of plausible hypotheses to see which ones survive a
gauntlet of testing for compatibility with available information. The
surviving hypotheses--those that have not been disproved--are subjected
to further testing. ACH, Heuer concedes, will not always yield the
right answer. But it can help analysts overcome the cognitive
limitations discussed in his book.
Some analysts who use ACH follow Heuer's full
eight-step methodology. More often, they employ some elements of
ACH--especially the use of available information to challenge the
hypotheses that the analyst favors the most.
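The winnowing step at the core of ACH can be sketched as a simple consistency matrix. The sketch below is an illustration of the general idea only; the hypotheses, evidence items, and scores are hypothetical, and real ACH practice (Heuer's full eight-step methodology) involves considerably more judgment than a tally.

```python
# Illustrative sketch of the ACH winnowing step: score each hypothesis
# against each item of evidence, then focus on refutation -- the
# hypotheses with the most inconsistencies are candidates for rejection.
# All hypotheses, evidence items, and scores here are hypothetical.

CONSISTENT, NEUTRAL, INCONSISTENT = 1, 0, -1

hypotheses = [
    "H1: routine exercise",
    "H2: deception operation",
    "H3: preparation for attack",
]

# evidence item -> score against each hypothesis (same order as above)
matrix = {
    "Troop movements observed":       [CONSISTENT, CONSISTENT, CONSISTENT],
    "No logistics buildup detected":  [CONSISTENT, NEUTRAL, INCONSISTENT],
    "Unusual communications silence": [INCONSISTENT, CONSISTENT, CONSISTENT],
}

def inconsistency_count(h_index):
    """Count evidence items that argue against a given hypothesis."""
    return sum(1 for scores in matrix.values() if scores[h_index] == INCONSISTENT)

# Rank hypotheses by how little evidence contradicts them; the
# "survivors" (fewest inconsistencies) merit further testing.
ranked = sorted(range(len(hypotheses)), key=inconsistency_count)
for i in ranked:
    print(f"{hypotheses[i]}: {inconsistency_count(i)} inconsistent item(s)")
```

Note that the ranking rewards hypotheses that have *not been disproved*, rather than those with the most confirming evidence, which mirrors Heuer's emphasis on challenging favored hypotheses with available information.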
Denial and Deception
Heuer's
path-breaking work on countering denial and deception (D&D) was not
included as a separate chapter in this volume. But his brief references
here are persuasive.
He notes, for example, that analysts often reject the
possibility of deception because they see no evidence of it. He then
argues that rejection is not justified under these circumstances. If
deception is well planned and properly executed, one should not expect
to see evidence of it readily at hand. Rejecting a plausible but
unproven hypothesis too early tends to bias the subsequent analysis,
because one does not then look for the evidence that might support it.
The possibility of deception should not be rejected until it is
disproved or, at least, until a systematic search for evidence has been
made and none has been found.
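Heuer's point here can be restated in Bayesian terms (my framing, not his): if well-executed deception rarely leaves visible evidence, then observing no evidence should barely lower one's estimate of the probability of deception. The numbers below are purely illustrative.

```python
# Bayesian restatement of Heuer's argument (illustrative figures only):
# when deception, if well planned, rarely produces visible evidence,
# "no evidence seen" is weak grounds for rejecting the deception hypothesis.

prior = 0.30                           # P(deception) before looking
p_evidence_given_deception = 0.10      # well-executed deception hides its tracks
p_evidence_given_no_deception = 0.05   # chance of a false alarm

# Observation: no evidence of deception was found.
p_no_ev_d = 1 - p_evidence_given_deception       # 0.90
p_no_ev_nd = 1 - p_evidence_given_no_deception   # 0.95

posterior = (p_no_ev_d * prior) / (
    p_no_ev_d * prior + p_no_ev_nd * (1 - prior)
)
print(round(posterior, 3))  # posterior stays close to the 0.30 prior
```

With these assumed numbers the posterior probability of deception is about 0.29, hardly below the 0.30 prior, illustrating why a systematic search for evidence, not its mere absence, is needed before the hypothesis can responsibly be set aside.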
Heuer's Impact
Heuer's influence
on analytic tradecraft began with his first articles. CIA officials who
set up training courses in the 1980s as part of then-DDI Gates's quest
for improved analysis shaped their lesson plans partly on the basis of
Heuer's findings. Among these courses were a seminar on intelligence
successes and failures and another on intelligence analysis. The
courses influenced scores of DI analysts, many of whom are now in the
managerial ranks. The designers and teachers of Tradecraft 2000 clearly
were also influenced by Heuer, as reflected in reading selections, case
studies, and class exercises.
Heuer's work has remained on reading lists and in
lesson plans for DI training courses offered to all new analysts, as
well as courses on warning analysis and on countering denial and
deception. Senior analysts and managers who have been directly exposed
to Heuer's thinking through his articles, or through training courses,
continue to pass his insights on to newer analysts.
Recommendations
Heuer's advice to
Agency leaders, managers, and analysts is pointed: To ensure sustained
improvement in assessing complex issues, analysis must be treated as
more than a substantive and organizational process. Attention also must
be paid to techniques and tools for coping with the inherent
limitations on analysts' mental machinery. He urges that Agency leaders
take steps to:
- Establish an organizational environment that promotes and rewards the kind of critical thinking he advocates--for example, analysis on difficult issues that considers in depth a series of plausible hypotheses rather than allowing the first credible hypothesis to suffice.
- Expand funding for research on the role such mental processes play in shaping analytical judgments. An Agency that relies on sharp cognitive performance by its analysts must stay abreast of studies on how the mind works--i.e., on how analysts reach judgments.
- Foster development of tools to assist analysts in assessing information. On tough issues, they need help in improving their mental models and in deriving incisive findings from information they already have; they need such help at least as much as they need more information.
I offer some concluding
observations and recommendations, rooted in Heuer's findings and taking
into account the tough tradeoffs facing intelligence professionals:
- Commit to a uniform set of tradecraft standards based on the insights in this book. Leaders need to know if analysts have done their cognitive homework before taking corporate responsibility for their judgments. Although every analytical issue can be seen as one of a kind, I suspect that nearly all such topics fit into about a dozen recurring patterns of challenge based largely on variations in substantive uncertainty and policy sensitivity. Corporate standards need to be established for each such category. And the burden should be put on managers to explain why a given analytical assignment requires deviation from the standards. I am convinced that if tradecraft standards are made uniform and transparent, the time saved by curtailing personalistic review of quick-turnaround analysis (e.g., "It reads better to me this way") could be "re-invested" in doing battle more effectively against cognitive pitfalls. ("Regarding point 3, let's talk about your assumptions.")
- Pay more honor to "doubt." Intelligence leaders and policymakers should, in recognition of the cognitive impediments to sound analysis, establish ground rules that enable analysts, after doing their best to clarify an issue, to express doubts more openly. They should be encouraged to list gaps in information and other obstacles to confident judgment. Such conclusions as "We do not know" or "There are several potentially valid ways to assess this issue" should be regarded as badges of sound analysis, not as dereliction of analytic duty.
- Find a couple of successors to Dick Heuer. Fund their research. Heed their findings.
Footnotes
1. Jack Davis served with the Directorate of Intelligence (DI), the National Intelligence Council, and the Office of Training during his CIA career. He is now an independent contractor who specializes in developing and teaching analytic tradecraft. Among his publications is Uncertainty, Surprise, and Warning (1996).

2. See, in particular, the editor's unclassified introductory essay and "Tribute" by Harold P. Ford in Donald P. Steury, Sherman Kent and the Board of National Estimates: Collected Essays (CIA, Center for the Study of Intelligence, 1994). Hereinafter cited as Steury, Kent.

3. Sherman Kent, Writing History, second edition (1967). The first edition was published in 1941, when Kent was an assistant professor of history at Yale. In the first chapter, "Why History," he presented ideas and recommendations that he later adapted for intelligence analysis.

4. Kent, "Estimates and Influence" (1968), in Steury, Kent.

5. Casey, very early in his tenure as DCI (1981-1987), opined to me that the trouble with Agency analysts is that they went from sitting on their rear ends at universities to sitting on their rear ends at CIA, without seeing the real world.

6. "The Gates Hearings: Politicization and Soviet Analysis at CIA," Studies in Intelligence (Spring 1994); "Communication to the Editor: The Gates Hearings: A Biased Account," Studies in Intelligence (Fall 1994).

7. DCI Casey requested that the Agency's training office provide this seminar so that, at the least, analysts could learn from their own mistakes. DDI Gates carefully reviewed the statement of goals for the seminar, the outline of course units, and the required reading list.

8. Unclassified paper published in 1994 by the Working Group on Intelligence Reform, which had been created in 1992 by the Consortium for the Study of Intelligence, Washington, DC.

9. Discussion between MacEachin and the author of this Introduction, 1994.

10. Letter to the author of this Introduction, 1998.