In Honor of Daniel Kahneman and What his Work Means for Project and Risk Management
His lesser-known yet more curious words and insights for project and risk management, and for AI risk analytics systems.
As someone who works in or manages projects, you make decisions every day. For some decisions you rely on objective facts to best navigate the project; for others you rely on common sense, mindsets, intuition, or group consensus, which may leave you with a sense of subjectivity and an awareness of potential biases. Phenomena like availability bias or groupthink are well known, yet even when we are aware of them, finding reliable steps to reduce them in challenging decisions seems elusive, and indeed it is.
Two weeks ago, Daniel Kahneman (1934-2024), the distinguished professor of psychology and behavioral economics, passed away. Kahneman dedicated his life to understanding errors in human judgment. He coined terms and concepts such as the intuitive System 1 and the elaborative System 2 as two modes of our thinking. You may also have heard of his prospect theory and loss aversion, for which he won the Nobel Prize in economics, and in which he posits that the pain of loss is harder to bear than the pleasure of gain.
We are dedicating our first article on the EPM Research Substack to the legacy of this great mind and to his somewhat less-known words and insights with major implications for project and risk management. Equally important are the implications of his work for building AI systems and risk analytics applications, where, one way or the other, you use expert judgement as input or convey insights as output about probabilistic phenomena.
Much of this brief article is a review of his book “Thinking, Fast and Slow” [1]. However, let us start with one of his transformational ideas from his 1977 paper with Amos Tversky, in which they identified three characteristics of errors in human judgment and prediction: first, errors are systematic rather than random; second, they exist in all humans regardless of expertise; third, and most importantly, they remain present even when you are fully aware of them, unless you learn to actively adjust for them, not much different from optical illusions [2]. Let us get into his universe.
Thinking Fast and Slow: System 1 and System 2
Consider my favorite puzzle of the book (pg. 44): a bat and a ball cost $1.10 in total; the bat costs one dollar more than the ball; how much does the ball cost?
Kahneman introduces the concepts of System 1 and System 2 as two distinct modes of thought in human judgment and decision-making. System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. It encompasses our intuitive and instinctual reactions, drawing on learned associations and heuristics to navigate daily tasks and decisions efficiently. In contrast, System 2 requires conscious thought and effort, involving deliberate analysis and reasoning. It is activated when we engage in complex problem-solving, make thoughtful decisions, or focus on tasks that demand attention. He clarifies that System 1 and System 2 are fictitious characters and not systems in the “standard sense” of systems with distinguishable parts, but the analogy helps in explaining many of the errors and biases in human judgement.
If your answer to the previous question was 10 cents, chances are that your elaborative System 2 did not take over; to engage System 2, you may just need an extra bit of effort to arrive at the correct answer of 5 cents. Integral to this idea is that System 2 requires more effort: our brains are run by lazy controllers that aim to minimise effort and hence avoid engaging System 2 as much as possible. Therefore, if you are a busy project manager with a “cognitively busy” routine, you are, in principle, more prone to errors of System 1. From this he concludes that states of “cognitive ease” may cause us to resort to System 1, and therefore making issues, discussions, or debates a bit (perhaps unnecessarily) complex to achieve “cognitive strain” might be a strategy to activate System 2 thinking in a group.
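For completeness, here is the slow, System 2 arithmetic behind the 5-cent answer, written out as a single linear equation:

```latex
% Let x be the price of the ball in dollars; the bat then costs x + 1.00.
\begin{aligned}
x + (x + 1.00) &= 1.10 \\
2x &= 0.10 \\
x &= 0.05
\end{aligned}
```

So the ball costs 5 cents and the bat $1.05, which indeed sum to $1.10; the intuitive 10-cent answer would make the total $1.20.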
Kahneman emphasises that intelligence is not merely the ability to reason. In fact, the System 1 ability to rapidly find relevant information in memory is a major component of intelligence. Therefore, there is no hierarchy of importance between the two systems; both are necessary and important parts of our brain.
What You See Is All There Is: WYSIATI
Let us consider another puzzle from the book (pg. 88). What do you think of Alan and Ben?
Alan: intelligent—industrious—impulsive—critical—stubborn—envious
Ben: envious—stubborn—critical—impulsive—industrious—intelligent
Our swift and often scarce-information-driven world is well suited for System 1 to thrive. System 1 is particularly capable of crafting coherent narratives from minimal data with ease. This inclination of System 1 is often rubber-stamped by the “lazy controller” of System 2, which, while capable of methodical analysis, tends to validate our intuitive beliefs if not pressed. He coins this idea What You See Is All There Is, with the rather weird acronym WYSIATI, referring to the associative memory function that is adept at jumping to conclusions. He elaborates that “jumping to conclusions on the basis of limited evidence is so important to an understanding of intuitive thinking.” It is just like how you may have jumped to the conclusion to hire Alan in the previous puzzle, even though Alan and Ben have the same qualities, just written in a different order.
Kahneman argues that WYSIATI is at the core of many types of human errors and biases, such as confirmation bias, availability bias, overconfidence, base rate neglect, and framing effects. WYSIATI highlights the importance of base rates, anchors, and what he calls the “outside view” when eliciting probabilistic information.
Inside vs. Outside Views, Singular vs. Distributional Information
To adjust for WYSIATI, Kahneman points to another cognitive bias, anchoring, in which individuals rely heavily on an initial piece of information (the “anchor”) when making decisions. Even if the initial information is arbitrary or unrelated, it can significantly influence subsequent judgments and estimates. While valid anchors could help in moving away from WYSIATI, it is evident that anchoring can be used, intentionally or unintentionally, to mislead, and in fact it is.
One way to add more information is to replace the single anchor point with a distribution of potential peer data points. This way, not only do you make sure your decisions are data-driven, but you also gain insight into the inherent uncertainty in the data.
In his 1993 paper with Dan Lovallo, Kahneman discusses the critical distinction between singular information (the inside view) and distributional information (the outside view) in forecasting. Let us consider a project as an example. He refers to the inside view as the known specifics of the current project at hand, while the outside view draws on the broader outcomes of similar previous projects. He concludes that the planning fallacy, our tendency to underestimate time and costs, is an inescapable consequence of neglecting the outside view [3].
Indeed, in many project domains, early estimates stem from benchmarking data of similar cases (i.e., the most inclusive reference class), and there are established methodologies for adjusting for differences in size and capacity (you may have heard of Lang factors). However, the planning fallacy creeps in through the procedural steps and budget approvals. To keep the outside view always present, especially at points of major decision making, Flyvbjerg et al. proposed the methodology of reference class forecasting (RCF) [4].
A large enough reference class will, in principle, include the variability across project types and contexts, resulting in sufficient representation. Kahneman summarizes that the outside view forms the “baseline prediction” and serves as an anchor for forecasts in the absence of any inside-view (case-specific) information. He continues that the estimate should then be adjusted “away from the mean in the appropriate direction” based on case-specific information and factual evidence of the riskiness of the case (pg. 248). This means that the outside view should not entirely dismiss the inside view, especially when it comes to course correction during project execution. In a previous paper, I proposed a methodology for “positioning” a project away from its outside-view baseline prediction by incorporating case-specific information from the inside view, with preliminary success [5].
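To make the mechanics concrete, here is a minimal sketch of how an outside-view baseline and an uplift could be computed from a reference class. The overrun ratios, the P80 risk level, and the function names are hypothetical illustrations, not figures from Flyvbjerg et al. [4].

```python
import numpy as np

# Hypothetical reference class: actual/estimated cost ratios of past, similar projects.
# These numbers are invented for illustration only.
reference_class = np.array([0.95, 1.02, 1.10, 1.15, 1.21, 1.30, 1.38, 1.52, 1.74, 2.05])

def baseline_prediction(ratios: np.ndarray) -> float:
    """Outside view: the median overrun ratio of the reference class."""
    return float(np.median(ratios))

def required_uplift(ratios: np.ndarray, acceptable_risk: float = 0.2) -> float:
    """Uplift factor so that the chance of exceeding the uplifted budget is
    roughly `acceptable_risk` (here, a P80 budget)."""
    return float(np.quantile(ratios, 1.0 - acceptable_risk))

initial_estimate = 100.0  # inside-view estimate, in any currency unit
print("Baseline (outside view) forecast:", initial_estimate * baseline_prediction(reference_class))
print("P80 budget with uplift:", initial_estimate * required_uplift(reference_class))
```

The inside view then enters as an adjustment away from this baseline, not as a replacement for it.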
The idea of updating the baseline prediction of the outside view based on case-specific evidence from the inside view bears a close resemblance to Bayesian probability updating, where the reference class is our prior knowledge. This is a major part of our research at EPM to enable Bayesian learning in different parts of projects. In his book “How to Measure Anything,” Douglas Hubbard proposes simple methods to conduct such Bayesian probability updating [6].
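As a minimal sketch of such Bayesian updating (the beta-binomial setup and all numbers below are my own assumptions for illustration, not Hubbard's worked examples), the reference class supplies the prior and case-specific observations update it:

```python
from scipy import stats

# Prior from the outside view: say 12 of 40 comparable projects experienced the risk event.
# Encoded as a Beta distribution, a simple conjugate prior for a probability.
prior = stats.beta(12 + 1, 40 - 12 + 1)

# Inside view: case-specific evidence, e.g., 2 occurrences across 3 similar work fronts so far.
occurrences, trials = 2, 3

# Posterior: conjugate update of the Beta prior with the binomial evidence.
posterior = stats.beta(12 + 1 + occurrences, 40 - 12 + 1 + trials - occurrences)

print(f"Prior mean probability:     {prior.mean():.2f}")
print(f"Posterior mean probability: {posterior.mean():.2f}")
```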
Conjunction Fallacy: Scenarios Not Predictions
Another important concept from Kahneman’s work, significant for us as project professionals, is the conjunction fallacy. It often skews risk assessments in construction and project management, where specific, combined scenarios are perceived as more likely than broader, singular ones. This bias leads decision-makers to prioritize the mitigation of detailed yet improbable risks over more probable ones, potentially resulting in suboptimal resource allocation.
Recognizing and decomposing this fallacy is crucial for more effective and efficient risk management, which requires collecting causal information within risk registers. Consider the following nuanced risk statements on a construction site:
1. Workers sustaining head injuries due to not wearing proper PPE.
2. Workers sustaining head injuries due to not wearing proper PPE in the underground channel.
While the correct action here should focus on the use of proper PPE, the conjunction fallacy may shift our attention to improving passageways in the underground channel, which, important as it may be, represents the less probable risk.
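The probability rule behind this is simple: adding a condition can never make an event more likely. Writing A for “head injury due to missing PPE” and B for “it happens in the underground channel”:

```latex
P(A \cap B) = P(A)\,P(B \mid A) \le P(A), \qquad \text{since } P(B \mid A) \le 1.
```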
A specific type of conjunction fallacy occurs when extensive effort is spent building or discussing possible scenarios, effectively shifting the focus of the discussion to WYSIATI. Consider the following two statements:
1. Rising global temperatures will lead to more extreme weather events.
2. Rising global temperatures will lead to more extreme weather events and mass migration.
While mass migration is an important phenomenon to consider, the first statement is certainly more likely. It is therefore always beneficial to remember that scenarios are not predictions, as Peter Tertzakian often mentions on the intriguing Arc Energy podcast [7].
Randomness and Regression to the Mean
In chapter 17 of the book, Kahneman describes a significant fact of the human condition: “the feedback to which life exposes us is perverse. Because we tend to be nice to other people when they please us and nasty when they do not, we are statistically punished for being nice and rewarded for being nasty.” As he describes it, there is a structural “regression to the mean” in our interactions.
The phenomenon of overlooking inherent randomness in events has significant implications in coaching, parenting, and certainly stock picking. By failing to recognize this randomness, we often incorrectly link actions to outcomes, leading to non-regressive predictions. Kahneman therefore suggests adjusting our predictions toward the average to account for this oversight.
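One textbook way to express such a regressive adjustment (a standard linear-regression formulation, not a formula quoted from the book) is to shrink the intuitive, evidence-based estimate toward the relevant average in proportion to how strongly the evidence actually correlates with the outcome:

```latex
\hat{y} = \bar{y} + \rho\,\big(y_{\text{intuitive}} - \bar{y}\big), \qquad 0 \le \rho \le 1
```

Here, rho is the estimated correlation between the evidence and the outcome; with rho = 0 the forecast collapses to the average, and with rho = 1 the intuitive estimate is taken at face value.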
In project and construction management, missing the effects of randomness and regression to the mean can have serious consequences, affecting a wide range of predictions and actions within a project, such as trend and change management, staffing plans, vendor delays, and even earned value management.
This issue becomes particularly acute in the creation of AI and statistical learning models. Many models fail to account for the randomness and noise within their data, easily memorising and overfitting the training set. Based on my observations, maintaining a healthy skepticism towards any AI claiming over 90% accuracy is wise, unless it has undergone a thorough examination of its residuals and bias-variance plots.
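As a minimal sketch of the kind of sanity check meant here (using a scikit-learn workflow on synthetic, deliberately noisy data; the dataset and thresholds are hypothetical), comparing training and held-out scores and glancing at the residuals exposes a model that has memorised noise:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score

# Hypothetical, noisy project data: features X and observed cost growth y.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
y = X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=300)  # substantial irreducible noise

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X_train, y_train)

# A large gap between training and held-out scores is a symptom of memorising the noise.
print("Train R^2:", round(r2_score(y_train, model.predict(X_train)), 2))
print("Test  R^2:", round(r2_score(y_test, model.predict(X_test)), 2))

# Residuals on held-out data should look like unstructured noise centred on zero.
residuals = y_test - model.predict(X_test)
print("Residual mean:", round(residuals.mean(), 2), "| residual std:", round(residuals.std(), 2))
```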
Limits of Intuitive Judgment
Eventually, the question becomes how and when we can trust our intuitive judgment. To explore this question, Kahneman, a skeptic of human intuition, engaged in an interesting collaborative journey, lasting about a decade, with Gary Klein, a psychologist and a proponent of the reliability of human intuition.
In their final paper, titled “A Failure to Disagree” [8], they recognize that experts can often make remarkably accurate snap judgments in domains where they have extensive experience and practice (an opportunity for learning) and where patterns are regular and predictable (a clear feedback mechanism).
This dual requirement highlights that for intuition to be trusted, experts need a domain with stable, learnable patterns, such as those found in chess or firefighting, and the opportunity to refine their skills through consistent practice and clear feedback. Kahneman further suggests that in environments of uncertainty or “noise,” algorithms often outperform human judgment.
The extent to which construction tasks, project sites, or the multi-year execution of mega-projects meet these criteria is debatable. However, it is crucial to consider these factors every time we seek expert judgment on the severity and likelihood of a risk.
His Legacy
Daniel Kahneman's journey is marked by humility and a relentless pursuit of truth. His legacy extends far beyond his contributions to decision sciences and behavioral economics. By engaging deeply with his critics, Kahneman set a great example of the academic spirit of curiosity and the quest for collaborative research.
His book, "Thinking, Fast and Slow," not only translated his lifelong research for non-specialists and hence paved the way for people like me to better understand this field and get interested. But also sparked a publishing trend, inviting a broader audience to explore the intricacies of research biographies outside their areas of expertise.
Any attempt to summarize the wisdom contained within his book falls short. Kahneman's masterpiece invites readers on a personal journey through its insights, challenging, enlightening, and enriching their perspectives. If you're considering a gift for someone, I can share from experience that "Thinking, Fast and Slow" is a great option to consider.
If you know someone, perhaps a project risk manager, who might be interested in this, or benefit from it, please consider sharing it with them and asking for their opinion.
References
1. Kahneman, D. (2011). Thinking, Fast and Slow. Macmillan.
2. Kahneman, D., & Tversky, A. (1977). Intuitive prediction: Biases and corrective procedures.
3. Kahneman, D., & Lovallo, D. (1993). Timid choices and bold forecasts: A cognitive perspective on risk taking. Management Science, 39(1), 17-31. doi:10.1287/mnsc.39.1.17
4. Flyvbjerg, B., Holm, M., & Buhl, S. (2005). How (in)accurate are demand forecasts in public works projects? The case of transportation. Journal of the American Planning Association, 71, 131-146.
5. Zangeneh, P., & McCabe, B. (2022). Modelling socio-technical risks of industrial megaprojects using Bayesian networks and reference classes. Resources Policy, 79, 103071.
6. Hubbard, D. W. (2014). How to Measure Anything: Finding the Value of Intangibles in Business. John Wiley & Sons.
7. Arc Energy Podcast. https://www.arcenergyinstitute.com/section/podcasts/
8. Kahneman, D., & Klein, G. (2009). Conditions for intuitive expertise: A failure to disagree. American Psychologist, 64(6), 515.