
The Sherman Kent Center for Intelligence Analysis

Occasional Papers: Volume 2, Number 3, June 2003

Sherman Kent’s Final Thoughts on Analyst-Policymaker Relations
Jack Davis
Sherman Kent Center

 

Sherman Kent, widely recognized as the single most influential contributor to the analytic doctrine and tradecraft practiced in CIA’s Directorate of Intelligence, was long seized with the importance, and difficulty, of establishing effective relationships between intelligence analysts and policy officials.

Based on World War II experience in the Research and Analysis Branch of OSS, Kent concluded that analysts, ever the junior partners, had to carry the larger burden in managing the relationships with their policy counterparts. This required analysts to reassess regularly the issue of effective ties as challenges and opportunities changed. Over his 17 years of Agency experience (1950-1967), Kent experienced frustrations as well as successes with what he saw as the central professional challenge of simultaneous service to two demanding masters—analytic integrity and policy clients.

In a series of post-retirement lectures in training courses for CIA and Defense Intelligence Agency analysts, Kent addressed two recurring challenges in analyst-policymaker relations—providing warning and analyzing intentions—that he argued needed fresh examination by each new generation of practitioners. Kent titled these lectures “Aspects of the Relationship between Intelligence Producers and Consumers.” While he admitted, in his final recorded thoughts on the issues, that his generation had found no failsafe formulas to ensure effective ties, he did point to the general paths that he believed needed to be taken.

  • In warning analysis, Kent judged that the analytic and policy “trades” were too distant in their relations. As a result, the “Warnees,” to use Kent’s term, mistrusted the motives and findings of “Warners” and too often failed to take requisite action to avoid dangers and seize opportunities. Kent, never wanting for an earthy turn of phrase, quipped, “Warning is like love—it takes two to make it.” The challenge was somehow to introduce much needed mutual understanding and trust into the relationship.
  • In intentions analysis, in contrast, Kent judged that analysts and policymakers were at times too close in their thinking about an adversary’s likely course of action. In this case, neither side would take proper measure of new information that could undermine a shared conclusion. Kent’s examples include the misreading of Soviet intentions prior to the 1962 Cuban Missile Crisis. The challenge here was to introduce more open-minded argumentation to the estimative process, not via intuition or worst-case estimating, but via solid alternative reasoning.

Kent’s last recorded observations on policy relations are contained in a series of handwritten manuscripts—written (scrawled, really) with No. 1 pencil on legal-size pads—included with the papers he donated to the Yale University archives. Six are outlines of lectures for CIA’s Office of Training and DOD’s Defense Intelligence School, 1971-1973. One is a note prepared for his retirement speech in December 1967. One other is a response to a request from an intelligence colleague for review of a manuscript on warning.

These sources complement Kent’s 1949 book Strategic Intelligence for American World Policy, based on his wartime experience with OSS’s Research and Analysis Branch, and his essay “Estimates and Influence,” prepared for an Allied conference on intelligence in London in 1965. Both noted the challenge of effectively managing the analyst-policy client relationship, with the latter cautioning that if analysts seek “influence” ahead of “credibility” they are in danger of achieving neither. Very similar challenges still confront intelligence analysts today, some 30 years after Kent rendered his final recorded thoughts on the issues.

 

Warning Analysis: The Danger of Too Distant a Relationship

One of the most often-cited passages from Kent’s Strategic Intelligence explains that the analytic process can be undermined both when analysts are too close in their ties to consumers (inadequate independence) and when they are too distant (inadequate guidance). Kent concluded that, of the two, being too distant was the more harmful to analysts and the national security. He was concerned, in 1949, mainly about the impact of organizational arrangements between the two callings on policymaker trust in analysts’ judgments. [1]

In his 1970s lecture notes, Kent advanced the argument that warning analysis often did not work because of too great a distance in the priorities and mindset of analysts as “Warners” on the one hand, and policy officials as “Warnees” on the other hand. Kent started off by defining the warning process as much more demanding than the issuing of intelligence reports and assessments. [2]

The single central issue of warning is that it is a multi-step process which involves two parties: the Warner and the Warnee.

Warning is not complete until:

(1) The Warner warns
(2) The Warnee hears, believes, and acts.

Kent, in notes for his 1967 “Swan Song” address to CIA colleagues, had set out what he meant by the requirement that the policymaker must act to complete the warning process. Policymakers, in effect, have to acknowledge the fact of having been warned and take the warning “aboard,” at least to the extent of calling policymaking colleagues to meet and discuss the reported threat or policy opportunity.

[There is] no warning if [the analysts’ assessment] (1)…is not read; (2) is read but not believed; (3) believed but not really taken aboard.

In assessing the performance of warning intelligence, Kent made clear he was excluding warning of a Soviet military attack, because that danger was then without peer in its importance and was overseen by the fulltime mechanisms of the interagency Watch Committee and National Indications Center. Kent argued that this omission was not the same as producing Shakespeare’s Hamlet “without the prince.” For warning, as he addressed it, still encompassed very important issues. For example:

  • Hostile acts of the USSR which were still far short of an armed attack on the US and its allies. Hungary, 1956. Czechoslovakia, 1968.

  • The possibility of an attempted political change in an important state.

  • Possible war between two friendly states. India-Pakistan.

  • A possible economic crisis [affecting US well-being], etc.

Kent described the “two completely different life styles” of the often reluctant partners in warning analysis—analysts and policymakers—a divergence he saw as a major cause of the disconnect that undermines effective relations. He set out their distinctive “psychological” drives, starting with analysts.

The Warner tries to watch everything in the world and issues a warning when in his opinion the thing he sees coming up is:

  1. Of considerable importance to the national security.
  2. Highly likely (or likely) to take place.
  3. The right time interval away:
    • Not this afternoon. The analyst goofed—too late;
    • Not next year—the analyst is too early.

In examining a something [i.e., a prospective event] to tell how it meets these criteria you [the analysts] realize that you are judging, weighing, estimating.

In some cases the gravity of the something, its likelihood, and the timing of its occurrence is crystal clear. Lucky you. Your own input into the decision to warn is minimal.

Most cases however will require that you do a lot of soul searching.

What is likely to determine how grave, how warning-worthy, something is?

The “facts,” sure—but in the marginal cases there aren’t enough. There are your prejudices, etc.

But the big determinant is likely to be a fear of under-warning. The Warner’s nightmare is having something important happen without having given warning—not having blown the whistle loud enough and in time.

Within the loose criteria above, he has a lot of latitude and a lot of room for subjective judgment. The tendency is to overwarn—to overvalue the ominous.

At the same time, the Warner realizes that important things will happen that he will not and cannot know about. History amply proves the point. The Iraqi coup, 1958. The Berlin Wall, 1961. The Indonesia coup, 1965.

Mark well that the two matters, (1) the Warner’s built-in tendency to overwarn and (2) his record of fallibility, are well known to the Warnee.

Kent, for his audience of analysts, then turns to the world of the Warnee and points to the cultural differences that work against the effectiveness of warning analysis as intelligence analysts prefer to conduct it.

Realize that the policymaker is no dope. He reads as much intelligence as he has time for—especially in his own area of concern.

Realize that intelligence [that is, the intelligence collector], proud of its nuggets and wanting recognition for them, passes them around long before any final evaluation or synthesis by analysts is possible.

In such a way, intelligence encourages its consumers to be junior grade intelligence officers. Sometimes they get to be adept indeed.

Next, realize that the Warnee has a full time job and is not looking for extra work or needless interruption of his regular duties. His circuits are already overloaded.

Realize when the Warnee receives a warning and elects to act upon it, the least that he must do is begin some very speedy contingency planning. The way the US government works this means a lot of meeting, talking, writing, clearing cables, etc. For a minor crisis in a minor African or Latin American republic the waves [of activity] will hit 100 officers perhaps. There really isn’t anything in contingency planning that is easy and effortless.

In discussing the Warnees’ psychology, Kent noted in one draft outline that policy officials also dread not heeding a warning and getting caught unprepared. The problem: “It is a lot easier [for the Warner] to warn than [for the Warnee] to get ready [to take action].”

Kent next homes in on the consequences of the cultural divide between producer and intended consumer of warning analysis and the suspicions and distrust engendered.

To cap all this: Both Warner and Warnee know of each other’s weaknesses.

Warners know Warnees are hard to convince. They will not be warned by a hint. The thing that will really jolt them into being warned is for the Warner to push his conclusions beyond what his evidence will legitimately support. This is seldom done, for good reason. It ain’t honest. It ain’t prudent.

Warnees know all about the Warners’ tendency to overwarn. And also about their fallibility.

The Warners’ credibility declines with warnings that turn out to be false alarms. And in the event that the Warners, once hurt by a false alarm, fail to warn of an important event, their credibility may be cooked for good.

In the face of uncertainty, and aware of the CYA attitude of the Warners, Warnees make their own judgment of [the criteria for] warnability.

Such then is the unhappy psychological relationship between those who guard the health—even the life of the state.

Kent sees the circumstances that work against effective warning analysis as “in the nature of things” and therefore resistant to change. In his earliest lecture draft, he notes, “we are in luck if [warning] ever works.” Later, he ratchets down the level of gloom a notch: “Of course it is not all that bad, but it is bad enough.”

His lecture drafts in the end provide a barebones outline for a potential “remedy” for the disconnects between analysts and policymakers that complicate warning analysis.

Care on the part of Warners not to overload the circuits.

Care on the part of Warnees not to develop too much callous.

Above all, more talk between the two.

We do not know how Kent expanded on the issue of improved practice once in the classroom. But with his third point, Kent identifies what I believe to be the most hopeful path to more effective warning analysis. In an earlier Kent Center Occasional Paper on “Strategic Warning,” I advocated a transformation of warning analysis from an intelligence function to a governmental function. [3]

Under such a regime, the policymakers responsible for completing Kent’s warning analysis loop—believing and taking action on warnings—would join forces with analysts in determining priority issues for assessment, likely triggers of changes in momentum, and signposts of increasing danger. Whether the appropriate response to a warning was calling a contingency planning meeting or alerting US military forces, the “Warnees” would by virtue of their participation in the process have a greater stake—and thus greater confidence—in the sounding of an alarm.

This closer partnership between what Kent referred to as the two national security “trades” would also provide the analysts with much needed guidance for developing the specialized substantive expertise and analytic tradecraft that would be received with action-inducing credibility by their policymaking clients.

 

Intentions Analysis: The Danger of Too Close a Relationship

Kent had some difficulty in choosing a label for the challenge he paired with warning analysis in his lectures about “two worrisome situations in the [policymaker] relationship, which are diametrically opposed.” His goal was to choose a challenge to effective analyst-policymaker relations for which ties were “too close,” and he settled on the term “vested intellectual positions.” Since all of the case examples that Kent raised in his lecture notes concerned estimating adversary intentions, I have substituted a label—“intentions analysis”—that focuses on the analytic process involved rather than the cultural or psychological root of the malfunction in relations.

It is worth noting that as Kent developed the two issues in his lecture notes, warning analysis often involves estimating an adversary’s intentions, and intentions analysis often involves the decision of whether or not to issue a warning. In his lecture notes on intentions analysis, Kent first sets out the inherent uncertainty that characterizes estimative judgments on an adversary’s planned course of action. [4]

In any intellectual utterance there are likely to be three sorts of statements. [Those that address]:

  1. The knowable and known [essentially the facts].
  2. The knowable and unknown [secrets].
  3. The unknowable [mysteries].

The latter two statements are estimates. You can think of an estimate as a sort of intellectual structure which has: (1) a base of more or less solid factual evidence; and (2) a top of highly reasoned conjecture. To carry weight this conjecture must have a rationale built on a plausible interpolation and/or extrapolation, or an analogy to history, tradition, etc. The conjecture must display some sort of logical consistency and intellectual integrity. If it has these qualities it will be convincing to many.

But this does not mean that the estimative judgment is correct today. Nor that if it is correct today it will still be correct next month or year.

Kent then talks about the sharply focused recognition of the fallibility of a complex estimative judgment on its “birthday,” and then the subsequent fading of this recognition.

Both producers and consumers begin by being fully aware that an estimate is a tentative judgment with odds pro and con. We in intelligence cite the odds so that “probable,” say, equals 3:1 for a judgment and at the same time 1 in 4 against a judgment. When the passage of time affords new evidence, it is thus susceptible of change.

But no matter how clear everyone is about the estimate’s tentative nature on its birthday, the tentative quality recedes more and more into the background of the thinking, especially of the consumer. Consumers tend to take such judgments as “yes or no” answers. Once accepted as correct, the judgment begins to take on a life of its own.

The producer too becomes numb in the absence of new evidence. In the nature of things, once articulated and agreed to in the Intelligence Community, it [the estimative judgment] is just as hard or harder to upset than it was to write in the first place.

The odds against [the 1 in 4 likelihood that a judgment will be wrong] tend to be forgotten. In the absence of compelling evidence to the contrary, it is likely to stand. Not that the judgment is not reviewed. It is. But it is hardy.

The worst case is when it is a Siamese twin. That is, when the intelligence estimate coincides with the estimate of consumers.
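A word on Kent’s odds shorthand in the passage above may be useful. The arithmetic that follows is an illustrative unpacking, not Kent’s own notation: odds of 3:1 in favor of a judgment and a 1-in-4 chance of its being wrong are one and the same statement.

    P(judgment correct) = 3 / (3 + 1) = 0.75
    P(judgment wrong) = 1 - 0.75 = 0.25 = 1 in 4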

After this explanation of the cognitive and bureaucratic psychology of the analytic challenge, Kent’s lecture notes give illustrative examples from what he calls “ancient history.” In each case a flawed estimative conclusion “swung around a probable course of action of the USSR.” In each case, the facts—mostly, the USSR’s past practices—were well known, and the estimate was made that the Soviets would probably continue to adhere to “well-understood…past policy.”

  1. The USSR and the Mid-East in the mid-1950s [missing the advent of close Soviet ties with Egypt in 1955]. Soviet Mid East policy had been “hands off” for years. That the Soviets would move on Egypt in a big way seemed “highly unlikely.” Not only because of the general Soviet [lack of interest] in the Mid East but also because there was no vestige of a hint that the Soviets would move to help a non-Communist bourgeois government with a strong local anti-Communist stance.

  2. The Sino-Soviet Relationship. The crux here was the unlikelihood [of a shift from alliance to enmity] when the relationship (to us at least) seemed of greatest benefit to both parties. Communist China was the Soviet blue chip in the Far East. The USSR was Communist China’s helper and protector. Intelligence was timid in announcing [a split]. Consumers were too.

  3. The 1962 Cuban Missile Crisis. The best case [for illustrating the problem] is Soviet missiles in Cuba.

The estimate and its rationale: [The 14 September 1962 estimate on the military build-up in Cuba concluded that the] Soviets were developing a defensive [military] capability. There was no evidence of an [actual or prospective] offensive weapon [deployment]. The judgment that the Soviets would probably not deploy offensive weapons [in Cuba] was based on secondary evidence. The estimate was very convincing and believed because of its reasoning based on secondary evidence [especially previous Soviet policy of not stationing offensive strategic weapons outside its borders and not risking direct confrontations where the United States held decisive military advantage].

The one doubter was DCI John A. McCone [who believed the Soviets would deploy offensive weapons in Cuba]. His problem was that he was away [when the 14 September 1962 estimate was produced]. But, and very important, [a judgment based on] intuition [and otherwise] unsupported doesn’t make converts. [That is,] if McCone had been in Washington and made a federal case of his intuitive guess, and had got the President’s ear, McCone would have had opposing him (1) the members of USIB [i.e., the Intelligence Community]; and (2) most presidential advisors, including the four most important ones [who were experts on the Soviet Union]—[former Ambassador Charles] Bohlen, [former Ambassador Llewellyn] Thompson, [former Ambassador George] Kennan, and [serving Ambassador] Foy [Kohler].

In a handwritten letter draft dated 3 December 1971, Kent reports that the latter statement is based on his having asked the four experts on Soviet affairs, after the discovery of the missiles had made clear that the estimative judgment on offensive weapons deployment was incorrect, what their previous judgments on the matter had been.

They were all honorable and decent enough to say they believed the NIE when it was written.

In his draft lecture outlines, Kent summed up the malfunction in the Egypt and China cases of misjudging Soviet intentions in language that encompasses the essential elements of the Cuban Missile case as well.

The lesson in both cases is the same: Our estimate of Soviet policy became less flexible than Soviet policy per se. If the Soviets telegraphed a change in policy—or if our indicator board showed a change in policy was perhaps in the making—we did not read it. Nor certainly did our consumers.

[The estimative judgments of both groups] hardened, so that barring incontrovertible evidence, which is rare, it becomes more and more difficult to modify, or upset, for both intelligence and the consumers. [As a result] the USG was slow to react to what was going on.

As with warning, Kent concludes that the malfunction in misjudging enemy intentions when analysts and policy officials share “vested intellectual positions” is “[rooted] in the nature of things” and not open to easy fixing. His lecture outline ends on a pessimistic note.

Is there anything to be done? I doubt it. What we have is just our own particular phase of the cultural lag [that is, the limits of the human intellect].

How Kent handled the issue of improving estimative performance in his classroom presentations is, alas, a matter knowable but unknown. In lecture notes and related unpublished commentary, he was particularly wary of worst-case estimating and also of ducking the hard cases by making no bottom-line estimative judgments. He made clear in addressing the Cuban Missile Crisis that “intuitive” alternative judgments, those without a convincing rationale, would not change minds of either analysts or policymakers who had done their homework via rigorous research and conjecture.

Yet by casting the recurring malfunction in intentions analysis as an instance in which analyst-policy relationships were too close, Kent implied that the introduction of healthy argumentation would be a suitable remedy for what has been called “group think.” I can then “estimate” that if exposed to the rigor the current generation of analysts has introduced to tradecraft by way of Alternative Analysis, Sherman Kent would have given serious consideration to its use in assessing secrets and mysteries.

A previous Kent Center Occasional Paper on analyst-policymaker relations indicates the range of Alternative Analysis approaches. [5] These initiatives attempt to overcome estimative inertia and cognitive limitations by changing the lens or framework through which the issue is addressed while still applying tough-minded analytic tradecraft. The goal is not necessarily to abandon but to challenge the strongly held original view.

Alternative Analysis techniques that could have been applied to assessing Soviet intentions in the 1962 Cuban Missile Crisis include:

  • Devil’s Advocacy—a deliberate attempt to support McCone’s judgment by alternative interpretations of the meaning of available evidence and the implications of gaps in information.

  • Risk-Benefit Analysis—an assessment of the various ways the USSR might weigh the stakes in introducing strategic weapons.

  • High Impact-Low Probability Analysis—an assessment of the implications for US interests if the Soviets had decided to deploy strategic weapons.

  • Quality of Information Check—an assessment of the authenticity, comprehensiveness, and consistency of the evidence behind the judgment that the USSR would not deploy strategic weapons.

 

Sherman Kent’s Legacy to Intelligence Analysts

In his published works and in discussions with his colleagues, Kent did not stint in his praise of the importance of the analysts’ mission. Clandestine and technical collection was important. And computers could amplify memory and calculations. But Kent let it be known at various times and in various words that he was convinced that the thoughtful analyst was, and always would be, “the intelligence device supreme” for assessing the complex national security problems confronting policymakers.

Kent also contributed to his chosen profession by defining in his writings and exhibiting in his personal practice many values, processes, and methods that still serve intelligence analysts well today—as surveyed in Kent Center Occasional Papers, Volume 1, Number 5, November 2002, “Sherman Kent and the Profession of Intelligence Analysis.”

In his final attempts at guidance, however, as indicated by his lecture notes, Kent let it be known that he and the first generation of professional intelligence analysts left tough, unresolved challenges as well as a sound legacy for subsequent generations of practitioners. How typical of Kent the teacher to leave his “students” not with textbook answers to the easy questions but with outstanding problems that demanded their fresh resolve and continued effort.


[1] This was the “Kent of the Book”—his early, published thinking on the analyst-policymaker relationship. Many of his later colleagues remembered more vividly the “Kent of the Agency,” who, out of concern for analysts’ integrity, warned, during his tenure in the Agency in the 1960s, against the seduction of analysts who got too close to powerful policymakers.

[2] Most of these citations are taken from Kent’s notes for 19 February and 15 November 1971 presentations to the Defense Intelligence School. In the many direct quotations cited in the text, the spelling out of abbreviations and other minor editorial changes are made without indication. More elaborate editing to clarify Kent’s barebones sentences is enclosed in brackets. At times a sequence of quoted thoughts includes sentences from several manuscripts addressing the same subject. Interpretations of the meaning of the tersely worded outlines are based on my reading of Kent’s published works, service under him in the Office of National Estimates during 1963-1967, and interviews of his senior colleagues. Access to the Kent papers at Yale University Library is gratefully acknowledged.

[3] Kent Center Occasional Papers, Volume 2, Number 1, January 2003, “Strategic Warning: If Surprise is Inevitable, What Role for Analysis?”

[4] The citations are again taken mainly from Kent’s manuscripts for 19 February and 15 November 1971 presentations to training courses offered by the Defense Intelligence School.

[5] Kent Center Occasional Papers, Volume 2, Number 2, January 2003, “Tensions in Analyst-Policymaker Relations: Opinions, Facts, and Evidence.”

     

    Disclaimer:
    All statements of fact, opinion, or analysis expressed in Occasional Papers are those of the authors. They do not necessarily reflect official positions or views of the Kent School, the Central Intelligence Agency, or any other US Government entity, past or present.

    Nothing in the contents should be construed as asserting or implying US Government endorsement of an article's factual statements and interpretations.

    These papers have been prepared with the support of Central Intelligence Agency funds and are published with the consent of the authors.

