December 27, 2025

The unseen whole

This article is part of a broader line of work on systems, structure, and decision-making developed at Inratios.


Every system aims to improve outcomes.
Organizations want change to work.
Teams want effort to translate into results.

Yet many initiatives fail not because the goal was wrong, but because action started before the situation was properly understood.
Seeing the broader picture before acting is not about slowing progress.
It is about aligning action with reality.
When systems act without a shared and accurate understanding of the situation, effort turns into activity without impact. Outcomes suffer, not because people worked too little, but because they worked on the wrong problem.


Action without context creates superficial change
Across domains, the same pattern appears.

In healthcare, clinical reasoning exists for a reason. Treatment starts only after the situation is understood well enough to define a meaningful goal. Acting on symptoms without understanding the underlying problem leads to ineffective or even harmful outcomes.

In management, the mistake is similar. Solutions are often proposed before the problem is clearly defined. Meetings produce action plans for issues that were never properly framed. The result is activity without direction.

In change management, this becomes even more visible. Organizations respond to triggers. A KPI drops. Engagement declines. A process slows down. A solution is introduced quickly. A new structure. A new tool. A new policy.

But the real question often remains unasked:
What problem are we actually trying to solve?
Without a broader picture, change addresses surface effects instead of structural constraints. The bottleneck stays in place. The struggle simply shifts elsewhere.


Seeing before doing defines the right goal
Seeing the broader picture is not about collecting endless information. It is about understanding the situation well enough to define the right objective.

  • In healthcare, this is called problem representation.
  • In research, it is problem framing.
  • In Lean, it is grasping the situation.
  • In system design, it means identifying constraints and defining a viable change path.

Different language. Same discipline. Goals that are set without this foundation are fragile. They solve the wrong problem efficiently.


Triggers are not problems
Many actions are responses to triggers, not to problems.

  • A trigger creates urgency.
  • A problem requires understanding.

When systems confuse the two, they optimize locally and fail globally. The appearance of progress replaces actual improvement. Seeing before doing changes that dynamic. It forces systems to slow down at the only moment where slowing down actually saves time later.


The moment where things usually go wrong
There is a recognizable moment in almost every complex situation.

  • Information is incomplete.
  • Pressure is rising.
  • Someone asks the inevitable question: “What are we going to do?”

At that point, action feels like relief. Doing something creates movement. Movement creates the feeling of control. Waiting feels risky. Looking feels passive.
So we move.
And that is often where the real mistake is made.


Action is not neutral in complex systems
In simple systems, acting quickly is often fine.
In complex systems, it is not.
Every intervention changes the system you are trying to influence. If your understanding of the situation is incomplete or distorted, action does not reduce uncertainty. It multiplies it.

  • In crisis response, a wrong early intervention can escalate the situation.
  • In organizations, quick fixes create recurring problems.
  • In markets, action without context increases volatility instead of reducing risk.

This is why experienced professionals are often cautious at the start. Not because they hesitate. But because they know that the first move reshapes the entire situation.


Seeing is not passive. It is work.
In crisis management, most effort goes into sensemaking before decisions are made.
In Lean, teams are trained to grasp the situation before jumping to countermeasures.
In research, weak problem framing leads to years of wasted effort.
In medicine, acting before the problem is properly represented is considered a cognitive error.

Different domains. Same logic.

  • Seeing is not waiting.
  • Seeing is active work.

It means gathering information deliberately.
It means structuring what matters and what does not.
It means resisting the urge to solve the first problem that presents itself.
This phase often takes more time than people expect. And that is exactly why it gets skipped.


Why this feels uncomfortable
Most environments reward visible action.
Meetings produce decisions.
Dashboards produce metrics.
Markets reward movement.
Careful observation does not look impressive from the outside. It does not generate quick wins. It does not satisfy the urge to intervene.

Yet in complex systems, the cost of acting on a wrong picture is far higher than the cost of waiting a little longer to understand what is actually happening. By the time the consequences appear, the original mistake is hard to trace. The system is already reacting to the intervention.


A pattern that repeats everywhere
Once you start looking for it, the pattern is hard to ignore.
Misdiagnosis in healthcare often traces back to premature closure.
Escalating crises often start with poor sensemaking.
Organizational change fails because the real problem was never framed correctly.
Trading losses pile up because action replaced context.
The failure is rarely a lack of intelligence or effort. It is a failure to form a reliable picture of reality before acting.


Seeing before doing
Seeing before doing is not a slogan.
It is a discipline.
It treats understanding the situation as the first intervention, not a preparatory step. It assumes that better execution cannot compensate for poor framing. And it accepts that in complex systems, not acting yet can be the most responsible move.
This shift changes how systems behave.
It changes how decisions are made.
And it changes what tools are actually useful.


What follows from here
Everything I work on starts with this principle.

  • Not with prediction.
  • Not with optimization.
  • Not with faster decisions.

But with better seeing.

In the next articles, I will unpack how this discipline shows up in crisis management, Lean thinking, research, medicine, and system design. And why most tools fail precisely because they skip this step.

For now, this is the core idea:
Most systems do not fail because they act too slowly.
They fail because they act before they have truly seen the problem.


A familiar story
There is an old story about a group of blind men asked to describe an elephant.

One touches the leg and says it is a pillar.
Another feels the trunk and calls it a snake.
A third holds the ear and insists it is a fan.
Each description is correct.
And each is incomplete.

They are not wrong because they lack intelligence.
They are wrong because none of them sees the whole.

The elephant does not change.
Only their understanding does.