To be uncertain is uncomfortable, but to be certain is to be ridiculous – Chinese Proverb

I have written before about the importance of certainty in our lives. We rely on certainty and related habits for morning ablutions, driving to work, withdrawing savings from bank accounts, schooling for our children, and so forth.

A degree of certainty is required for most of us in our jobs. Our seniors establish processes to deliver outcomes for the organization for the same reason we employ habits in our personal lives – to reduce risk. But we also enjoy and pursue special projects and new opportunities that offer change and, hopefully, personal growth. And while some of us are serious risk-takers, as seen in the Winter Olympics, most of us are not when it comes to ourselves personally, including putting our jobs in jeopardy.

This all suggests that certainty is desirable because it establishes our expectations, guides us through much of every day and eliminates a significant amount of stress from our lives by avoiding risks. And yet, unexpected incidents occur to remind us that life is uncertain to some degree: pandemics, blizzards, tainted food, fires and car accidents. Some of these we have prepared for, but others we are less ready to address. 

When we start to consider complex weapon systems platform acquisition projects worth billions of dollars, members of the Canadian Armed Forces (CAF), Ministers and taxpayers expect a high degree of certainty around budget and timely delivery almost as entitlements. The Chinese would say we find comfort by expecting certainty in the performance of these expensive undertakings.   

Few such projects have been delivered to the CAF effectively (with no failures once commissioned into service) or efficiently (on time and within budget). The current government is establishing the Defence Investment Agency (DIA) to improve the performance of the weapons platform procurement system; all such projects significantly exceed the DIA's minimum project-authority threshold of $100 million.

The challenges facing the DIA are enormous, as identified in countless treatises by learned observers, academics, past practitioners, journalists, officials of the Office of the Auditor General and occasional parliamentary committee reports.

While the number of strategic and tactical shortfalls is significant, one stands out for me as the top priority facing the DIA within our government ecosystem – the expectation of certainty, rooted in risk aversion, that is baked into the federal government's culture.

A Wise Man Speaks

Having spent a decade with a portfolio of complex weapon systems platform acquisition projects, I have retained an interest in the subject after retiring and at last having time to learn through study and reflection.

Recently I ran across an article entitled ‘Life is Probability’ by Robert Timothy Leake, the founder of the RT Leake Group. The article and others on his website provide an interesting perspective that addresses the concept of certainty in ways that I believe speak to Canada’s military procurement system.

https://www.rtleakegroup.com/blog/life-is-probability

What follows are quotations from the article. They are about life in general, but they offer lessons for those who engage in complex endeavours such as weapon systems platform acquisition projects:

  • We are taught, either explicitly or implicitly, to treat life as a system that can be controlled: make the right plan, gather enough information, reduce risk and execute well. If you do those things, outcomes should follow. Except they don’t.
  • When a baby falls while learning to walk, it isn’t failure – it’s information. The same dynamic governs any skill that requires stability in a changing environment. Hovering a helicopter, for example, isn’t holding still. It’s continuous correction, never perfectly right, always responding to drift.
  • Human performance doesn’t grow by eliminating uncertainty. It grows by engaging it. All human behavior and skill develop this way: through exposure, correction, and gradual calibration inside uncertainty.
  • Probabilistic environments don’t reward perfect plans. They reward timely engagement… Competence, in real systems, is not prediction. It is ongoing adjustment… In real systems, the rules shift, conditions change, learning carries forward, and future moves are shaped by interaction.
  • The goal is not to eliminate risk. It is to remain exposed enough to learn, while protected enough to continue. Experienced people notice when conditions are shifting, sense when delay is becoming dangerous, act earlier than comfort would suggest and adjust faster after contact with reality.
  • Most leadership failure doesn’t come from recklessness. It comes from over-investment in control. Leaders try to eliminate uncertainty before acting. But experienced leaders don’t expect certainty. They expect variation, and they don’t confuse a single outcome, good or bad, with proof… You are accountable not just for outcomes, but for whether you acted when action was required, whether you stayed engaged long enough to learn, and whether you adjusted as conditions changed.
  • Probability tempers explanation and sharpens attention.

Application to the DIA

The author’s premise is that certainty is the wrong objective. Yet consider an attribute of the acquisition system for weapon platforms that the DIA will inherit – a master class in risk aversion, driven by the belief that control and certainty are possible. This belief produces behaviours that undermine desired outcomes, including the following:

  • the absence of honest, spin-free transparency (e.g. the misleading project timelines and costs announced at project launch);
  • contract lawyers focused on zero risk contracts (transferring all risks to the industry) and avoiding amendments when complexity guarantees the need to do otherwise;
  • failure to embrace structured collaboration based on relationships across the contract divide that might put prudence and probity at risk;
  • governance believing that information briefings and control through red tape assure certainty in results;
  • static risk registers and the necessity for mitigation strategies which create complacency and the illusion of reduced risk;
  • up-front plans stretching out years, as required for project approvals, which are entirely assumption-based;
  • process growth for every realized misstep to avoid future risk, which creates greater process complexity debt that creates more mistakes and more remedial processes – fail, add process, fail again and repeat;
  • avoidance of the risks of innovation which preempts trial-and-error improvement based on emerging techniques and other nations’ approaches to navigate complexity; and
  • confidence that atrophies unless expectations are broadly and continuously adjusted during project execution.

The required changes are obvious. The DIA culture should shift to one that expects continual change and accepts exposure to the inevitable risks of such complex acquisitions – adapting and learning while being protected by seniors.

Beyond government practitioners, the defence industry cares as well. These manifestations of government risk aversion delay contract awards and amendments, shape straitjacketed contracts that prevent flexibility, and routinely lead to late deliveries and the related cost growth that damage reputations. In reality, significant and unpredictable challenges always emerge during implementation by the Prime Contractor’s consortia. Rectifying government’s adverse reaction to risk is therefore as important to the defence industry as it is to the DIA’s probability of success.

The Importance of Probability

For clarity, if probability is ‘the likelihood of something happening’, then certainty (defined as ‘a fact that an event is definitely going to take place’) has a probability of 100%. Mr. Leake’s article implies that life is about probabilities of less than 100%. And since neither life nor projects are certain about events happening, there is a probability that some desired outcomes will not occur.

This is not an uncommon thesis. Novels written by Jeffery Deaver feature a main protagonist named Colter Shaw who navigates through life by making all decisions based on the percentage chance of various scenarios occurring.

The scope of this article does not allow a full exploration of the application of probability analysis to complex weapon systems platform acquisition projects. However, a few considerations are offered:

  • The key to assessing probability is the amount and assured validity of information gathered about consequences of various decision options available.
  • With foreseeable risks, there are some tools that provide guidance. For example, the common confidence level – often called the ‘sweet spot’ – for proceeding with business decisions is a probability of 70%.
  • Because effective use of probability in decision-making depends on the availability and accuracy of relevant information, a significant challenge exists with the very complex data sets encountered when acquiring, designing and constructing unique weapon systems platforms, where the data set is huge and often heavily assumption-based. Where schedule and cost estimates are required, Monte Carlo methods can be employed to generate probability confidence levels.
  • Also important in such projects are the continually emerging risks – many of which are not foreseeable until the weak signals of an unexpected event are observed. In huge work breakdown structures, it can take many weeks to assess, in probability terms, the consequential impact of changes to key aspects of plans.
  • Also important is the risk tolerance of those governing these projects as translated into probability thresholds – something often only determined as decisions are made, and in democratic governments typically pegged very low.  
  • Finally, great care must be taken to identify significant changes to the data used to set expectations, by routinely monitoring current ground truth.

Put bluntly, uncertainty creates the need for probability analysis, but producing such assessments is no easy process with most military platform acquisitions.
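To make the Monte Carlo idea concrete, the sketch below simulates a schedule estimate for a hypothetical project of three sequential work packages. All task names, duration estimates and the announced schedule are invented for illustration; a real project would draw on its work breakdown structure and far richer distributions:

```python
import random

# Illustrative only: three sequential work packages, each with
# (optimistic, most-likely, pessimistic) duration estimates in months.
tasks = [
    (10, 14, 24),   # design
    (18, 26, 48),   # build
    (6, 9, 20),     # test and acceptance
]

def simulate_totals(trials=100_000, seed=42):
    """Run Monte Carlo trials of total project duration.

    Each trial samples every task from a triangular distribution
    and sums the results; the sorted totals form an empirical
    distribution of the overall schedule.
    """
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in tasks)
        for _ in range(trials)
    )
    return totals

def percentile(sorted_vals, p):
    """Duration that p% of trials finish within (a confidence level)."""
    idx = min(len(sorted_vals) - 1, int(p / 100 * len(sorted_vals)))
    return sorted_vals[idx]

totals = simulate_totals()
p50 = percentile(totals, 50)
p70 = percentile(totals, 70)          # the '70% sweet spot' threshold
announced = 49                        # hypothetical launch announcement
prob_meet = sum(t <= announced for t in totals) / len(totals)

print(f"P50 schedule: {p50:.1f} months")
print(f"P70 schedule: {p70:.1f} months")
print(f"Chance of meeting the announced {announced}-month schedule: {prob_meet:.0%}")
```

The point of the sketch is not the numbers but the shape of the output: a distribution of outcomes rather than a single date, from which a P70 commitment can be read off and compared against whatever schedule was announced at launch.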

The Challenge Ahead

The challenge as the DIA stands up is not that there will be no certainty to project outcomes. Every citizen in Canada knows what their grandfathers knew – certainty only applies to death and taxes.

The most likely initial DIA challenge is the public’s expectation of much improved outcomes. Because the Agency has been created to improve the performance of ‘big’ military procurement projects, the DIA will be expected to routinely deliver effective platforms in a timelier manner to the CAF – an expectation based on the myth of certainty of knowledge.

Will the DIA be allowed to transparently admit reality upfront and to brief periodically about the emergent issues, the need to adapt and the things that were tried but did not work? In other words, will expectations be set properly at project launch and subsequently managed to educate all Canadians, while gradually improving performance inside the unavoidable uncertainty of each project?

We can hope that the probability of ‘yes’ answers to both questions is well above 50%, even though my dated experience would estimate it is hovering in the single digits.

Only time will tell whether the DIA is perceived as successful or just another government bureaucracy.