Re-thinking intelligence analyst training

Canada has not had commissions of inquiry or reviews of analytic capability – nothing on the scale of our allies' – to examine the systemic issues related to intelligence failures. Hence, there are no recommendations to review training or improve intelligence analysis. Yet our closest allies, in responding to their inquiries, are all taking a hard look at their training. And while the Canadian intelligence community is different in many ways, we face the same risks of analytic failure and, therefore, have just as much need to think about how to reduce that risk.

Like our allies, we must improve training and analysis. We have a golden opportunity to learn from their past experiences and their current efforts to study their mistakes. Not only do we face the same challenges; we must also ensure that our analysts who cooperate and exchange views with allies understand the new approaches to tradecraft used in the US, UK and Australia, and can meet those higher standards.

Who needs training? The first group is the analysts themselves, though requirements would differ from the new recruit to the senior analyst. The second is the managers of analysts who must understand how to manage the intelligence process, how to translate policy requirements into intelligence requirements, how to challenge analysis, and so on.

And third are consumers of intelligence products. Whether they are government decision-makers, parliamentarians or members of oversight committees, they also need to be educated about the probabilistic language of intelligence – they often don't know why things are written as they are, and have unrealistic expectations or misconceptions.

I believe there should be three distinct approaches to training that ideally would complement one another. First, an integrated training program geared towards the common needs of intelligence analysts across the community. Second, internal departmental programs geared to the job-specific requirements and mandates of each agency. And third, academic programs offered in universities geared to those who want to gain a more general knowledge of the business and want to study the history of intelligence – past successes, failures, and the role of intelligence in international relations.

There are limits to what a training program can achieve. There are a number of qualities inherent in analysts that can neither be taught nor trained, including enthusiasm, curiosity, confidence, judgement, imagination, creativity, and strategic thinking.

We cannot and should not expect even the most brilliant of analytical minds to be thrown into an intelligence analysis division without being taught the fundamentals of the discipline. These fall into two broad categories. The first deals with the common or generic competencies of intelligence analysis. The second focuses on the specialized competencies relevant to each of the different disciplines of intelligence analysis.

Building bridges
In September 2005, we launched the Intelligence Analyst Learning Program (IALP), a Canada-wide program run out of the Privy Council Office on behalf of the Intelligence Assessment Coordination Committee.

The IALP was born of increasing demands from analysts themselves for more and better training – a workshop hosted by the Canadian Association of Professional Intelligence Analysts in November 2004 clearly identified a lack of adequate training in a number of areas. It is intended to complement rather than duplicate departmental training, and focuses on the common competencies. The program provides an overarching body of community training resources and is meant to facilitate cross-participation and access to the various existing specialized training courses.

We’ve recently completed our first, entry-level course (ELC) for 20 junior analysts from a variety of federal departments and agencies. Next, we intend to develop a series of more advanced two- to three-day modules for senior analysts that will take a much more in-depth look at analytic techniques, critical thinking, the use of probabilistic language, and so on.

We’re also planning a series of one-day seminars for managers of analysts. We believe managing intelligence analysis requires a good understanding of biases, the various techniques and methodologies, and the need to challenge analysis and how to do it. We’re also starting to think about how to educate our consumers, how to sensitize them to the role and limitations of intelligence.

It might be rather ambitious at this stage, but I’d like to think that we could work toward an intelligence analysis centre of excellence or an institute of higher learning for intelligence analysis.

In addition to elevating the quality of analysis, the IALP provides an incredible opportunity for culture change, to go beyond the silo mentality. It has so far proven to be a considerable community-building tool. I think we have underestimated the degree to which there is a lack of mutual understanding and appreciation for the different types of intelligence analysis. The networking that we have witnessed so far in the ELC has already contributed to creating a more closely knit community and enhancing the sense of belonging to a community.

I’m hopeful that the program will allow for the harmonization of analytic practices. By conducting joint training, analysts are sharing and spreading best practices. They are gaining a better understanding of the language used by colleagues in other departments. The program should also allow each department to focus its resources on its specific requirements and specialized competencies, which will optimize existing training courses.

With any new endeavour, we face a number of challenges. We’re working on finding the right balance between theory and practice in each of our modules, identifying trainers with credibility who will bring clout to the program, getting past the silo mentality of individuals in the community, and more.

However, there are two much more significant challenges. The most immediate is evaluation: how to evaluate the improvement of analysis as a result of training. How do we qualify and quantify the impact of training on each analyst? Do we do it from feedback from managers or from consumers? What is the benchmark for comparison? Even more complex is assessing how improved analysis has affected the decision-making process.

The second challenge deals with resources and commitment. The Canadian intelligence community has not been mandated to take a closer look at its training practices to improve analysis. The impetus has come from within. The challenge, therefore, is getting the necessary commitment and resources to push the process through before there is an intelligence failure and before we get an inquiry proving we needed more emphasis on training in the first place.

Monik Beauregard is deputy executive director, European division, of the International Assessment Staff in the Privy Council Office.

From the March/April 2007 issue.
