Advances in scientific knowledge, translated into new technology, have made previously unmanageable intelligence tasks feasible and greatly increased the speed at which intelligence professionals perform traditional activities. Improved sensors, transmission capabilities, and analytical tools deliver unprecedented volumes of information and processing capabilities to the intelligence community and its customers, military and political decision makers. Processes that used to take days or weeks now take only seconds. Activities carried out at dispersed locations throughout the world can be managed centrally, ensuring coherence in the information delivered and a rapid flow of intelligence between the field and administrative office.
And yet, problems that have always plagued intelligence seem impervious to the information revolution. A 1990 report titled Whence and Whither Intelligence, Command and Control? The Certainty of Uncertainty explains: "The continuing availability of ever smaller, faster, cheaper, better tools for information processing gives us the illusion that throwing these tools at perennial problems of intelligence, command and control can solve these problems once and for all. In reality, the new tools continuously trigger readjustments in numerous interlinked balancing acts ... The endless frontier of complexity accounts for our simultaneous sensations of both progress and deja vu." For 22 years, the Seminar on Intelligence, Command, and Control has brought military and civilian leaders to Harvard University. Their opinions and anecdotes illuminate the persistent balances that the intelligence community and its customers must keep adjusting. While technology may tip some of these balances in one direction or another, it has not eliminated the conflicting forces that must be balanced--and never will.
Supply and Demand
The intelligence professionals of 20 or even 10 years ago might well have considered the technology available to the intelligence community today a dream come true. The information revolution has provided the intelligence community and its customers with superb tools, including supercomputers and software for intelligent information processing, high-resolution satellite imagery, and sensors that penetrate natural and manmade barriers to identify targets or link directly to precision-guided weapons. In-Q-Tel, the nonprofit corporation spun off from the Central Intelligence Agency (CIA) to encourage commercial development of intelligence-related technology, has been designed to connect the intelligence community to some of the most innovative thinking in the private sector.
The result in many areas is that the intelligence community and its customers no longer suffer from information scarcity but from information overload. Analysis must cover enormous quantities of data, in which valuable information may at best be implicit. Even so, many decision makers who remember the information scarcity of the Cold War era still demand more from the intelligence community, thereby perpetuating approaches more appropriate to penetrating the Soviet Union than to dealing with terrorists.
Efficient search engines and vast databases certainly allow the intelligence community to use that mass of information to supply objective intelligence almost instantaneously (e.g., "Which airfields in Country X can accommodate C-130 transport planes?" or "What are the physical characteristics of the enemy's new anti-aircraft radar?"). Geospatial information systems can combine classified satellite imagery, digital terrain elevation data, hydrographic information, aeronautical information, and foundation feature data to create maps that give deployed troops displays not only of the natural environment but also of manmade features and recent activity in an area. In many tactical situations, such intelligence products suffice to meet the operational user's needs.
If intelligence professionals were geographers or historians, today's technology might be a panacea. But, as James M. Simon, assistant director of central intelligence for administration, points out, "Intelligence is not history; it is secret information of actionable use. Intelligence must also try to divine intent, or future developments, where facts are not available." While electrons may move at the speed of light, human understanding does not. No matter how rapidly electronic information reaches an intelligence analyst, that analyst must still read, digest, and act upon it. That information must be winnowed, cross-checked for accuracy, analyzed for significance, and finally disseminated in a form appropriate for use by the intended customer.
By and large, therefore, high-speed computers and networks, drawing on the products of the latest generation of sensors, allow the intelligence community to perform its traditional tasks faster but have not changed these tasks fundamentally. Modern technology has increased not only the amount of information available to the intelligence community and the speed at which it can--and must--analyze, tailor, and disseminate that information, but also the speed at which customers demand intelligence products. The intelligence community must deliver intelligence as rapidly as forces move and faster than an enemy can deliver intelligence to its forces. Organizations threatened by an information attack need immediate warning. Thus, each improvement in technology brings about an equivalent increase in demand. The balancing act remains.
Knowledge is never complete, so the intelligence community must constantly balance the imperative to warn of an impending danger against the risk of overreacting on the basis of insufficient evidence. Information technology compounds this problem. Charles Allen, a former US national intelligence officer for warning, describes the dilemma facing intelligence analysts: "What policymakers and warning analysts must continually relearn is that there is a tradeoff relationship between the probability of false alarm and the probability of accurate warning. If the consequence of a false alarm is 'missiles away!' or the belittlement of an analyst who cried wolf, then the [analyst's warning] threshold is inevitably going to go up." Technology undoubtedly brings the analyst vastly more data to sift for relevant information, but the amount can be overwhelming and the tools to help in the sifting process have not yet matured. The same intelligence community that can deliver high-resolution images of the most remote parts of Afghanistan misidentified the Chinese embassy in Belgrade with disastrous results. Admiral Thomas R. Wilson, director of the US Defense Intelligence Agency, recalls, "I was the J-2 (director of intelligence) on the Joint Staff, so I was the one who showed the picture of the Chinese embassy to the president of the United States (among 900 other pictures I showed him) and said, 'We're going to bomb this because it's the Yugoslav department of military procurement.' We had good sources that said that's what it was; another agency (not my own) got that information and gave us that identification, and our databases were not able to find that mistake in targeting. Even though we had people in the systems who knew that building was the Chinese embassy, it had not been entered into the database." 
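Allen's tradeoff is the classic one from signal detection theory: any warning threshold that reduces false alarms also reduces the chance of sounding a genuine alarm. The following sketch illustrates this with entirely hypothetical numbers (the score distributions and thresholds are assumptions chosen for illustration, not anything drawn from intelligence practice):

```python
import random

random.seed(0)

# Hypothetical indicator scores: quiet days drawn from one distribution,
# attack days from a shifted one. An analyst warns when the score
# exceeds a chosen threshold.
quiet_days = [random.gauss(0.0, 1.0) for _ in range(10_000)]
attack_days = [random.gauss(1.5, 1.0) for _ in range(10_000)]

def rates(threshold):
    """Return (probability of false alarm, probability of accurate warning)."""
    p_false_alarm = sum(s > threshold for s in quiet_days) / len(quiet_days)
    p_warning = sum(s > threshold for s in attack_days) / len(attack_days)
    return p_false_alarm, p_warning

for t in (0.0, 1.0, 2.0):
    p_fa, p_w = rates(t)
    print(f"threshold {t:.1f}: P(false alarm) = {p_fa:.2f}, P(warning) = {p_w:.2f}")
```

Raising the threshold drives the false-alarm rate down, but the probability of warning of a real attack falls with it: exactly the relearning Allen describes when a cried-wolf analyst quietly raises his or her own threshold.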
More recently, information that might have prevented some of the September 11 attacks apparently existed somewhere within the vast quantity of data collected by the intelligence community, but the systems for using such information have lagged far behind the ability to collect data.
Allen also points out that "sometimes very good intelligence is unable to affect policy decisions. In many cases, good intelligence can lead to little action other than hand-wringing. We may simply have no options, or we don't have the political will to act on that intelligence. It's as though intelligence at times is akin to medical screening for an incurable disease." The current controversy over the intelligence community's warnings regarding possible suicide hijackings and the actions taken (or not taken) by the White House provide an excellent, albeit tragic, illustration.
In the 1988 seminar, Brigadier General Frank J. Breth, then director of intelligence for the US Marine Corps, stated, "There's a terrific intelligence capability in space. It's nice to have the president receive the product but it's also nice to have that forward-deployed Marine out there know what that system can do for you and get him those products." This illustrates the continuing tradeoffs that the intelligence community must make as it provides intelligence products. Should it "push" selected intelligence to its customers, or should it provide a large resource and allow customers to "pull" what they want? The technology to do both improves continuously, yet the problem remains. Users still complain that the information pushed to them is not what they want or is not in a form they can use. On the other hand, knowing what to pull and how to pull it requires a rather sophisticated user--and that user may overlook an important resource.
Added to that is the problem of determining who the users are. The forward-deployed Marine and the president may need some of the same intelligence information, but they need it at vastly different levels of detail and in vastly different forms. Automated information processing helps the intelligence community generate different products from the same data but does not resolve the perennial questions of what to share with whom and how.
Throughout the 22 years of the seminar, speakers have complained about the incompatibilities among the communication and information systems used by different military and intelligence organizations. The intelligence community does not decide what equipment its military customers should buy, but it feels the repercussions if those customers are unable to avail themselves of its resources because of incompatible hardware or software. If anything, the pace of technological change has exacerbated this problem because the generations of technology succeed each other so fast that systems within a single organization may not be compatible, let alone systems used by different organizations. According to retired US Air Force General Robert A. Rosenberg, "National, theater, coalition, commercial, and organic reconnaissance and surveillance are done by totally separate processes that don't leverage each other for battlespace dominance, and important sources of information are not being used because they are not part of 'classic' intelligence, surveillance, and reconnaissance."
Retired US Army Colonel Kenneth Allard, a former special assistant to the chief of staff, pointed out in 2001 that useful business intelligence usually comes from independent sources because in-house market researchers usually tell their managers what they want to hear rather than what they should know. The same argument applies to the intelligence community. Intelligence organizations should be independent of politics and personalities, concentrating on delivering the best possible information and analysis regardless of how welcome that information may be. However, most decision makers, unless they are exceptionally open-minded, eventually cease to pay attention to intelligence purveyors who do not address their top priorities.
Moreover, the intelligence community must balance the demand to meet immediate customer needs against the need to provide strategically relevant analytic products. Especially when a particular threat dominates the national agenda, as international terrorism does today, intelligence agencies find it difficult to allocate resources to in-depth research on a threat that might arise in the future, for example, from a currently quiescent country. Keith Hall, then US deputy assistant secretary of defense for intelligence and security, argued in 1994, "The intelligence community gets criticized frequently for not paying attention to its customers, but I think we pay too much attention to our customers. It is what is in our customers' inbox that guides what we do when we come in to work in the morning ... As a consequence, we don't have a sense of where we stand in getting the total job done in terms of the interest of the country at large."
James Simon, US assistant director of central intelligence for administration, echoes this viewpoint, arguing that the proliferation of intelligence organizations in the military departments and in the unified and specified commands undermines the ability of nationally oriented intelligence agencies to task collection assets and gain the data they need to conduct strategic analyses. This problem is independent of technology, except to the extent that sensible tasking of intelligence assets and improved processing techniques increase the ability to generate multiple products from a given source.
The Human Element
Every seminar speaker who has addressed intelligence issues has agreed on three things: the technical advances are impressive, far more remains to be done, and the United States desperately needs to improve its human intelligence capabilities. In short, the best technology and the best technical intelligence will never replace the need for human intelligence. The dismantling of much of the US human intelligence apparatus during the administration of US President Jimmy Carter had a disastrous effect on the ability of the United States to collect intelligence, especially against non-state groups such as Al Qaeda. While there can be no doubt of the difficulty and risk involved in infiltrating such groups, the Taliban's acceptance of John Walker Lindh demonstrates that it is not impossible.
As it collects intelligence, the intelligence community must constantly maintain a balance between reliance on technical means and more traditional sources. This challenge is complicated by an unfortunate byproduct of the esoteric nature of today's high technology: a tendency among some leaders to lend greater credence to technical intelligence than to intelligence collected by other means. Decision makers who do not understand the system seem to believe that, "If it came from that fancy signals intelligence system or that satellite, it must be true." Discussing the notorious discrepancies between national reports of tanks destroyed during Desert Storm and locally generated reports, former Director of the US Defense Intelligence Agency's National Military Intelligence Collection Center John Leide recounted: "The folks back in Washington were using satellite imagery, and only that, to take the tanks. We were using satellite imagery. We were using signals intelligence. We were using defector reports. We were using pilot reports. We were using mission reports. We were using RF-4 photography. We were using U-2 photography. We had a whole plethora of things we were using, and they were using a one-dimensional asset."
Technical intelligence clearly requires the same verification and cross-checking as open-source intelligence or intelligence collected by an agent in the field. Accuracy and insight result from corroboration by multiple sources. This consideration played a role in the design of Intelink, the intelligence community's classified counterpart of the Internet, whose peer-to-peer networking capabilities allow analysts to discuss and compare sources and conclusions, thereby improving the quality of the intelligence posted. Yet despite the undoubted benefits of using Intelink, not all analysts avail themselves of it, and it cannot eliminate all inconsistencies or errors.
The intelligence community faces special pressure as custodian of the nation's secrets and the means by which they are gathered. At the same time, the intelligence community and its customers recognize that security limits operational effectiveness and can be counterproductive. For example, although technical advances have enabled such approaches as multilevel security at the desktop level, such systems remain cumbersome and often prompt users to circumvent them or simply not use them. Better encryption techniques limit unauthorized access, but they impede authorized access as well.
Several military speakers have complained that the intelligence community has tended to use protection of sources and methods as an excuse not to share information that could be essential to other users and whose release would in fact endanger little besides somebody's turf. Rosenberg described a telling example: "Several weeks after the O'Grady shootdown [in which a US Air Force pilot was shot down over Bosnia in 1995] a French aircraft got shot down ... As part of our outbriefing to Secretary [of Defense William] Perry and [Director of Central Intelligence John] Deutch, I laid on a conference table ... the one lousy photograph that the SOF [Special Operations Forces] helicopter crews had of the battlespace where that downed French airplane was and where we were going to have to send kids to put their lives into danger so they could recover our allied friends ...
"Next to that one lousy photograph I had also laid out all of the electro-optical infrared radar and every other marvelous piece of information they had in Washington, DC, of that same battle scene. I said, 'Bill [Perry], this is a criminal act. There's no reason why you and the president and [National Security Advisor] Tony Lake and [Secretary of State] Warren Christopher shouldn't have this, but there's also no reason why the kids whom you are willing to sacrifice in the name of freedom shouldn't have it also.' Up until that time, there were all kinds of restrictions on how we could get national foreign intelligence information out into the hands of our warfighters ... Every time we fail to share information with coalition partners we put our own forces at risk, because we're operating from a different sheet of music. Every time we overclassify information and don't put it into our own warriors' hands, we risk their lives too."
However, Rosenberg agreed that certain sources and methods require the closest possible protection, and Rae Huffstutler's 1988 comment regarding protection of human sources will always remain valid: "If you aren't careful about compartmentation, you're going to lose your source in a snap of the fingers. He's going to pop up, be quoted in the newspaper, and he's going to be dead. I've seen it happen, and it can happen within days of an article hitting the press."
To its credit, the intelligence community has engaged in some innovative efforts to maintain essential protection while making more information available to certain users. Recognizing that data collected by national technical means could prove valuable for environmental research, the intelligence community established the Environmental Task Force (ETF; later continued under the name MEDEA) in 1992. ETF/MEDEA teamed some 35 cleared scientists with members of the intelligence community in an unprecedented effort to devise ways of creating so-called fiducial products for researchers without revealing sensitive details about collection methods and to task national collection assets to gather data useful to scientists as well as to the intelligence community. Yet such efforts remain the exception, and the intelligence community must continue to balance sharing versus security in a context where each choice can affect national survival.
The Balancing Act
No matter how good, fast, or innovative technology becomes, people will never have "enough" intelligence. The huge improvement over the last 10 years has simply whetted users' appetite for more, better, and faster information. In the 22 years of the seminar, every speaker has lauded the improvements in intelligence, and every one has stated that those improvements are not good enough. Users have come to expect a higher level of information; thus, rather than marveling at the progress made, they concentrate on the imperfections. Their continuously rising expectations are fed by a very simple phenomenon: there are no limits to complexity. No matter how accurate a picture the intelligence community delivers, there is always a picture with higher resolution. No matter how many high-resolution pictures users receive, they always want twice as many, and they always want them faster.
As US Air Force Lieutenant Colonel Gregory Rattray explains, "By enabling some things to be done in fractions of a second that used to take hours or years, technology gives the illusion that you can have your cake and eat it too. But the interplay of measures and countermeasures soon speeds everything up. So tradeoffs that once were measured in hours now get measured in seconds. That creates a need to reset the balance on a different time scale than before."
The conflicting demands facing the intelligence community will never disappear. There will never be a techno-fix for the intelligence community, any more than there will be for banks, airlines, or individual citizens. As science continues to advance, the intelligence community will always find innovative methods for satisfying customer needs. Yet these advances will simply require adjusting the same balances in an ever-rising spiral; equilibrium will remain unattainable.
MARGARET S. MACDONALD is Senior Editor at the MITRE Corporation in Bedford, Mass. ANTHONY G. OETTINGER is Gordon McKay Professor of Applied Mathematics and Professor of Information Resources Policy at Harvard University.