When the Model Is Right and the Decision Is Still Wrong
4/12/2026 · 5 min read
Why executives keep making slow, political decisions even when the data is good.
You did not get to the C-suite by accident. You have navigated regulatory shifts, market volatility, aging infrastructure, and a workforce in transition. You have seen risk frameworks come and go. You have sat through countless scenario planning sessions and walked away with a thick deck and a thin sense of clarity.
So here is a direct question: When was the last time your risk process actually changed a major decision?
Not informed it. Not validated it. Changed it.
For most energy executives, the honest answer is uncomfortable. And that discomfort is exactly where the opportunity lives.
The Problem Is Not Risk Management. It Is Decision Architecture.
I started my career as a chemical engineer. In process engineering, you learn quickly that the most dangerous failures are not the ones you can see. They are the ones hiding in the connections between systems, the places where pressure builds silently until something gives.
Leadership organizations fail the same way. Not in the obvious places. In the connections.
The greatest impediment to your strategic velocity is not competition. It is the ambiguity you tolerate in your internal decision making process.
When I work with executives, they almost never come to me with a risk problem. They come with what looks like a project delivery problem, a team alignment problem, or a strategy execution problem. What we find, consistently, is something deeper: a values misalignment that has quietly rewired how the organization makes decisions.
The risk framework is fine. The models are running. The dashboards are green. And yet decisions are slow, political, and disconnected from what actually matters. That is not a data problem. That is a decision architecture problem.
Most advisors treat risk, change, and delivery as separate disciplines. The problems that bring executives to me live exactly where those three things collide. That is also where the fixes are.
Three Things That Are Always True When Decisions Go Wrong
These are not theoretical. They show up in every organization I work with, in some combination.
Your risk picture only shows what you agreed to look at.
Every decision is made inside a partial, constructed version of reality. You never have the full picture. What you have is the slice of reality that your organization has decided to pay attention to.
In energy, aerospace, critical infrastructure and tech, this means your risk picture is only as good as what your teams chose to measure, report, and escalate. The rest (the slow-moving, hard-to-quantify, politically inconvenient risks) live outside the frame. Not because they do not exist. Because your situation was built to exclude them.
The question is not whether your risk register is complete. It never is. The question is whether you are honest about what is missing from it.
The connections between your risks matter more than the risks themselves.
A supply chain disruption that triggers a regulatory delay that coincides with a leadership transition is not three separate risks. It is one composed event, and its probability and impact are entirely different from any of its parts.
Most organizations manage risks in silos (operational, technical, financial, regulatory, reputational). The connections between silos are where crises are born. And those connections are almost never mapped, because the teams responsible for each silo rarely build their risk picture together.
When a crisis hits and people ask how no one saw it coming, the answer is almost always the same: everyone saw their piece. Nobody was looking at the whole.
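The arithmetic behind the composed-event point can be sketched in a few lines. The probabilities below are invented for illustration, not taken from any real risk register; the "common stress factor" is a stand-in for any shared cause (a market shock, a policy change) that raises several risks at once:

```python
import random

random.seed(42)

# Hypothetical marginal probabilities for three risks in one quarter.
# These numbers are illustrative, not from any real register.
p_supply = 0.10      # supply chain disruption
p_regulatory = 0.10  # regulatory delay
p_leadership = 0.10  # leadership transition

# Silo view: treat the risks as independent.
p_independent = p_supply * p_regulatory * p_leadership  # 0.001

# Coupled view: a shared stress factor raises all three
# probabilities at once. Estimated by Monte Carlo.
trials = 200_000
hits = 0
for _ in range(trials):
    stressed = random.random() < 0.15        # common-cause event
    boost = 4.0 if stressed else 1.0         # stress multiplies each risk
    a = random.random() < p_supply * boost
    b = random.random() < p_regulatory * boost
    c = random.random() < p_leadership * boost
    if a and b and c:
        hits += 1

p_coupled = hits / trials
print(f"independent estimate: {p_independent:.4f}")
print(f"coupled estimate:     {p_coupled:.4f}")
```

Under these toy numbers the coupled estimate comes out roughly an order of magnitude higher than the independence assumption, which is the whole point: the composed event is not the product of its parts.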
Your baseline assumptions may reflect a world that no longer exists.
The beliefs baked into your risk models and your organizational culture came from somewhere: your industry's history, your organization's past, your own career experience. Those beliefs shape every risk judgment you make, usually without you knowing it.
When the energy transition accelerates faster than your models expected, when a cyber event hits infrastructure you assumed was isolated, when a geopolitical shift rewrites your supply assumptions overnight — the question is not whether your data is current. It is whether your beliefs have updated. Rational decisions based on outdated assumptions are still wrong decisions.
Mathematical confidence is not the same as accuracy. A model that starts with outdated assumptions will update rationally and arrive at a confidently wrong conclusion.
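That claim can be made concrete with a toy Bayesian update. All the numbers here are hypothetical: suppose a model inherited two hypotheses from the old regime (a daily grid-event rate of 2% or 5%) while the world has quietly shifted to 12%, a value the model cannot even represent:

```python
import math

# Two hypotheses inherited from the old regime: daily grid-event rate
# is either "normal" (2%) or "elevated" (5%). Illustrative numbers.
rates = {"normal": 0.02, "elevated": 0.05}  # the model's whole universe
true_rate = 0.12                            # the world as it now is

events, days = 36, 300   # observed: 36 event-days out of 300 (12%)

# Log-likelihood of the data under each hypothesis (binomial, up to a
# constant that cancels when normalizing).
def log_lik(p):
    return events * math.log(p) + (days - events) * math.log(1 - p)

logs = {h: log_lik(p) for h, p in rates.items()}
m = max(logs.values())
weights = {h: math.exp(v - m) for h, v in logs.items()}
total = sum(weights.values())
posterior = {h: w / total for h, w in weights.items()}

for h, p in posterior.items():
    print(f"P({h} | data) = {p:.6f}")
```

The update itself is perfectly rational, and the posterior ends up near-certain about "elevated" — yet 5% is still less than half the real rate. The confidence is a property of the hypothesis set, not of the world.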
The AI Layer Makes This More Urgent, Not Less
Energy, aerospace, tech, and critical infrastructure organizations are rapidly integrating AI into risk monitoring, predictive maintenance, grid management, and scenario modeling. That is not a problem. The problem is what gets lost in translation.
AI systems learn from historical data. In an industry undergoing structural transformation (energy transition, decentralization, electrification, new regulatory regimes) the past is an increasingly unreliable guide to the future. An AI model trained on a stable, fossil-fuel-dominated grid does not know how to reason about a grid where 40% of supply is intermittent and decentralized.
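The training-data point can be sketched numerically. This is a stylized toy, not a grid model: a risk band calibrated on a stable history gets breached constantly once a large intermittent component (the invented numbers below) enters the mix:

```python
import random
import statistics

random.seed(7)

# Stylized history: a stable, dispatchable grid with small fluctuations.
stable = [100 + random.gauss(0, 2) for _ in range(1000)]

# Stylized present: a large intermittent component adds big swings.
shifted = [100 + random.gauss(0, 2) + random.choice([-25, 0, 25]) * 0.4
           for _ in range(1000)]

# "Model": predict the long-run mean; risk band from historical error.
mean = statistics.fmean(stable)
sigma = statistics.stdev(stable)
band = 3 * sigma  # a "99.7%" band -- valid only under the old regime

breaches_old = sum(abs(x - mean) > band for x in stable)
breaches_new = sum(abs(x - mean) > band for x in shifted)
print(f"band breaches on historical data: {breaches_old}/1000")
print(f"band breaches after the shift:    {breaches_new}/1000")
```

On the historical series the band is breached a handful of times; after the shift it is breached in the majority of periods. Nothing in the model is broken — its training distribution simply no longer exists.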
More importantly: AI systems inherit the biases of the humans who built them, labeled the data, and defined what counts as a risk. When those outputs are presented as objective (as numbers on a dashboard rather than as human judgments in algorithmic form), executives stop questioning them. The model becomes the situation. And the situation is wrong.
This is not an argument against AI in risk management. It is an argument for executives who understand what they are looking at when the model speaks. That understanding does not come from the model. It comes from the leader.
Three Questions Worth Bringing Into Your Next Leadership Meeting
You do not need a new framework. You need better questions.
What is not in our risk picture right now — and who decided to leave it out?
Every risk register is also a list of things that were excluded. Someone decided those things were not worth tracking. That decision reflects values, politics, and perception. It is worth examining explicitly.
Where are our silos hiding our biggest cross-functional exposures?
The most dangerous risks in energy organizations live at the boundaries: between operations and finance, between project delivery and regulatory affairs, between technology and workforce. When did you last require those teams to build their risk picture together?
Are we updating fast enough?
Not updating your risk models. Updating your beliefs. When new information arrives (a grid event, a regulatory signal, a competitor move) how quickly does it change what you actually do? If the answer is slowly, the bottleneck is almost certainly decision architecture, not data.
Risk management is not the destination. It is the infrastructure for better decisions. And better decisions are the only competitive advantage that compounds.
What This Looks Like in Practice
The leaders I work with who navigate complexity most effectively have a few things in common. They treat their risk process as a living decision tool, not an annual compliance exercise. They require cross-functional risk mapping — not just reporting. They actively interrogate their own assumptions, especially when their models are most confident. And they understand that their values, made explicit and aligned across the leadership team, are the foundation everything else is built on.
That last point tends to surprise people. Values sound soft. In my experience, misaligned values are the hardest and most expensive organizational risk there is. They show up as slow decisions, political noise, missed signals, and transformation initiatives that never quite stick.
A chemical engineer learns early that systems fail at their weakest connection. In leadership organizations, the weakest connection is almost always human. Not because people are incompetent. Because the architecture was never designed to make their judgment visible, aligned, and fast.
That is fixable. And fixing it changes everything downstream.
If this reflects something you are sitting with, I am happy to think it through with you.
