Intelligence Analysis



Rodrigo Maciel, a Brazilian police officer and visiting PhD candidate at CESAM, is an expert in intelligence analysis at the strategic and tactical levels. Here, he blogs about handling massive amounts of information in intelligence analysis, and how combining two perspectives can lead to an optimised way of making sense of events around the world. This blog post is partially based on his previous research.

 

Intelligence analysis: Information for decision-making in security issues

One objective of studying information science is to understand how humans and organisations collect and use information to construct meaning and to make decisions.

Before the Second World War, intelligence activity was inseparable from the exercise of power by kings and ministers. With better means of collecting information emerging during WWII, nations faced an information explosion, including a growth of intelligence activities that handled secret information. Thus, a new perspective was added to the intelligence services: not only to collect secrets but also to put together all the pieces of information. This indirect means of gathering facts is the main role of intelligence analysis, carried out by a set of organisations inside a nation state that collect, process and analyse information about threats and opportunities in the international arena, and about internal security issues. The process is usually depicted as a pyramid: the large base is composed of data and information, and the narrow peak is the output of refined meaning.

Two perspectives of intelligence analysis

Two concepts of intelligence analysis, from the perspective of the United States, were raised and explicated by Sherman Kent and George Pettee. Both men worked as intelligence analysts during WWII. Before their wartime activities, Sherman Kent was a reputed academic at Yale with expertise in history and 19th-century French politics, and George Pettee was an economist in the Department of Political Science at Amherst College. Their different experiences generated different perspectives about the best approach for intelligence analysts. As researchers, both men based their concepts on scientific methods, but with striking differences.

Sherman Kent structured a process focusing on the refinement of all available information. His knowledge pyramid has three elements, from bottom to top:

  • Basic descriptive: A repository such as an encyclopedia, in which factual data about countries would be classified, such as by population, geography, political system and military power. The CIA World Factbook, produced by the US Central Intelligence Agency, is an example of this kind of analytical product.
  • Current reportorial: News reports which update the current situation, with the objective of reviewing the basic descriptive elements and informing policymakers about subjects that might demand immediate action. An example is the President’s Daily Brief, a document produced by the CIA to give an account of world developments for the US President.
  • Speculative evaluative: The two previous sets of knowledge related to known factual reality. Based on these, this final element aims to create scenarios that enable future developments in the security landscape to be anticipated. A collection of declassified intelligence estimates, containing predictive analyses produced by the US intelligence community, is available for reading.

Even though Kent considered intelligence analysis a mental endeavour, the main characteristic of the process he created is its focus on a wide collection of information in order to solve an analytical problem.

George Pettee’s work was concerned with how data is collected, processed and enters the brains of intelligence analysts not as raw data, but as formed hypotheses. Such hypotheses would be manipulated through logical operations, resulting in new premises, and the cycle repeated until analytical judgements are reached. The structure resembles a pyramid with a great amount of factual data at the base, premises and conclusions at mid-level, and a cosmology of ideas at the top which offers meaningful relationships between the events and actors in the world.

Pettee argues that the process is based on a new arrangement of already-known data, with constant evaluation of the premises. He thought this would result in supplanting former conclusions that were based on a lot of data but included findings that were subsequently shown to be false.
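To make that iterative loop concrete, here is a minimal Python sketch of constant re-evaluation. It is an illustration only, not Pettee’s own method or notation: the premises, the conclusion and the reevaluate helper are entirely hypothetical, and the point is simply that a conclusion survives only as long as every premise it depends on still holds.

```python
# Illustrative sketch of Pettee-style constant re-evaluation: a conclusion
# is kept only as long as every premise it depends on still holds.
# All premises and conclusions here are hypothetical examples.

premises = {
    "P1": True,  # hypothetical: "port traffic in region X has doubled"
    "P2": True,  # hypothetical: "group Y controls the main access road"
}

# Each conclusion records which premises it was derived from.
conclusions = {
    "C1": {
        "depends_on": {"P1", "P2"},
        "judgement": "group Y is financing itself through the port",
    },
}

def reevaluate(premise_id: str, still_holds: bool) -> None:
    """Re-assess one premise and supplant any conclusion that relied on it."""
    premises[premise_id] = still_holds
    if still_holds:
        return
    for cid in list(conclusions):
        if premise_id in conclusions[cid]["depends_on"]:
            print(f"{cid} supplanted: premise {premise_id} no longer holds")
            del conclusions[cid]

# New reporting shows P2 was wrong, so C1 is discarded rather than retained.
reevaluate("P2", still_holds=False)
print(conclusions)  # {} -- the analyst must form new premises and a new conclusion
```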

Understanding the two processes

How could these two processes then help us face the massive amount of information in our modern lives, especially in the context of safety and security? First, we could consider both of these concepts as complementary. Sherman Kent proposed a general process which helps us understand how to organise information in ways that enable multiple agencies to share a common data structure and language for communication. From an intelligence perspective, the process could be optimised so that specialisation based on the source of data could populate the encyclopaedic repository.

George Pettee provides a sense of humans’ propensity for cognitive errors – and of how to be conscious of them – which could help to improve the analysis of intelligence. More recently, this cognitive approach has begun to be formalised in a book published by the CIA, Psychology of Intelligence Analysis.

So both processes can be combined for the treatment of information, particularly in this time of information turbulence. New ways to collect, store and process data are available, and we should use every technological tool at our disposal to extract meaning from the deluge of information. Nonetheless, it is important to note the mental processes that are incorporated into these new technologies.

Even algorithms incorporate premises and assumptions that depend on the type of data or the analytical problem at hand. For instance, in network analysis there can be no optimal algorithm for all possible community detection tasks, mainly because the concept of community in a network is defined on theoretical grounds. Consequently, given the vast variety of vendors and analytical tools available to business, security and intelligence agencies, it is important to know the working mechanisms embodied in the software. Otherwise, we could be replacing our own mental processes with those of the algorithm’s creator.
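As a small illustration of this point, the sketch below (assuming Python and the open-source networkx library, which is not part of the original post) runs two standard community detection algorithms on the same small graph. Because each algorithm embodies a different theoretical definition of “community”, their partitions of the same data need not agree.

```python
# Two community detection algorithms applied to the same network can
# disagree, because each encodes a different theoretical notion of "community".
import networkx as nx
from networkx.algorithms import community

# A small example graph: two cliques of five nodes joined through one bridge node.
G = nx.barbell_graph(5, 1)

# Modularity-based detection: communities are chosen to maximise a global modularity score.
modularity_parts = community.greedy_modularity_communities(G)

# Label propagation: communities emerge from local majority voting on node labels.
label_parts = list(community.label_propagation_communities(G))

print("Greedy modularity:", [sorted(c) for c in modularity_parts])
print("Label propagation:", [sorted(c) for c in label_parts])
# The two partitions need not match: which answer is "right" depends on which
# definition of community the analytical problem actually calls for.
```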

To conclude, the best combination of these two perspectives is to use Kent’s conception of storing and processing the maximum possible amount of information, while applying the safeguards proposed by George Pettee to evaluate, over and over again, the premises and conclusions that humans and computers derive from that information.

Image: Artificial intelligence by Many Wonderful Artists, Flickr, 1.0.

 

If you enjoyed reading this, try another one in our series of blog posts about aspects of safety from the Centre of Excellence in Public Safety Management (CESAM) at Rotterdam School of Management, Erasmus University (RSM). It is intended to act as an introduction to the Centre’s work; to promote and foster the professional development and management of public safety organisations, and to give CESAM members a platform to share their observations and experiences as academics and citizens. Please see our webpage to find out more.

 

Type
CESAM blog