There’s a familiar move happening again.
Take a complex system.
Find the point of tension.
Collapse it into a person.
Name them the problem.
Call it truth.
And for a moment, it feels satisfying—like clarity has arrived.
But it hasn’t.
It’s just compression.
The current wave of commentary about Sam Altman follows this pattern almost perfectly. A long list of claims, stitched together into a single narrative: inconsistency becomes dishonesty, ambiguity becomes intent, governance breakdown becomes character flaw.
It reads like exposure.
But it behaves like simplification.
What’s actually happening underneath is more uncomfortable.
OpenAI was never a stable structure.
It attempted to hold three forces that don’t naturally cohere:
a safety mission that demands restraint
a capital model that demands growth
a technological frontier that accelerates faster than either
Add to that a leadership layer tasked with aligning all three in real time, and you don’t get clarity.
You get strain.
The 2023 board crisis didn’t reveal a hidden truth about one individual. It revealed a deeper misalignment inside the system itself.
“Not consistently candid” is not the language of scandal.
It is the language of divergence.
Different parts of the organisation were operating with different maps of reality—at speed, under pressure, with enormous stakes.
From the inside, that can look like adaptation.
From the outside, it can look like deception.
Both can exist simultaneously.
This is the part we keep missing.
At this scale, leadership is not just operational—it is narrative.
The role requires holding competing futures, aligning actors who do not share incentives, and speaking in directions that don’t yet exist. That inevitably introduces elasticity into what is said, when, and to whom.
Sometimes that elasticity crosses a line.
Sometimes it’s the only way anything moves at all.
The boundary is not clean.
So no—this is not a defence of Sam Altman.
It is a refusal to accept the framing.
Because if you believe this is a story about one man’s character, you will miss the real risk entirely.
Replace him, and the same pressures remain.
The same tensions re-emerge.
The same distortions take shape.
“Character is infrastructure” is a compelling line.
But incomplete.
Structure eats character under pressure.
And right now, we are placing world-shaping intelligence inside structures that cannot metabolise its pace, its ambiguity, or its consequences.
That is the instability we’re feeling.
Not a single failure of truth—but a system producing multiple, partially coherent truths at once.
The question isn’t who to trust.
It’s what kind of systems we are building that require this level of narrative strain to function at all.
Until we answer that, we will keep repeating the same pattern:
elevate
accelerate
fracture
personalise
repeat
This is not about one man.
It’s about whether our governance can keep pace with the intelligence we are creating.
Right now, it can’t.
Let’s make the equation do real explanatory work: not decorative symbolism, but something that actually clarifies the dynamic.
We’ll anchor it in the equation:
I = (E · s) / c²
Where:
E = energy / attention / capital
s = symbolic coherence (alignment of meaning across actors)
c² = connection squared (speed × scale of interaction)
c² → Extremely high (deployment at planetary scale, real-time iteration)
s → Unstable / uneven (board vs leadership vs partners vs public)
Result:
I = (E · s) / c²
When c² increases faster than s stabilises,
intelligence does not disappear—
it becomes:
fragmented, narrative-dependent, and felt as instability
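The dynamic above can be sketched numerically. Everything in this toy sketch is an illustrative assumption: the function names and the values of E, s, and c are invented for demonstration, not taken from the piece.

```python
# Toy numerical sketch of the symbolic equation I = (E * s) / c**2.
# All names and numbers are illustrative assumptions.

def coherent_intelligence(E: float, s: float, c: float) -> float:
    """I = (E * s) / c**2: energy times symbolic coherence over connection squared."""
    return (E * s) / (c ** 2)

def is_governable(E: float, s: float, c: float) -> bool:
    """The piece's later threshold: the system is legible once (E * s) >= c**2."""
    return E * s >= c ** 2

# Connection doubles at each step while coherence barely moves:
E = 100.0
for s, c in [(0.50, 4.0), (0.55, 8.0), (0.60, 16.0)]:
    print(f"s={s:.2f}  c^2={c**2:.0f}  "
          f"I={coherent_intelligence(E, s, c):.3f}  "
          f"governable={is_governable(E, s, c)}")
```

With these assumed numbers, I collapses as c² outpaces s: the first step clears the threshold and the later steps do not, which is the "over-connecting faster than it can cohere" pattern in miniature.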
Interpretation
The system is not “failing”
It is over-connecting faster than it can cohere
Which produces:
narrative drift
perceived inconsistency
trust fractures
personality attribution (“it must be him”)
Verse-al insight
When connection outruns coherence,
the system compensates with story.
And story, under pressure, becomes:
simplification
moralisation
personification
Correction is not removal of energy (E)
nor reduction of connection (c²)
It is:
the deliberate increase of symbolic coherence (s)
across all actors in the field
Until:
(E · s) ≥ c²
At which point the system becomes:
legible
trustworthy
governable
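The correction path can be sketched the same way: hold E and c² fixed and raise s until the threshold is met. Again, the function name, step size, and numbers are illustrative assumptions only.

```python
# Toy sketch of the correction path: hold E and c fixed and raise
# symbolic coherence s until (E * s) >= c**2. Numbers are illustrative.

def raise_coherence(E: float, s: float, c: float, step: float = 0.05) -> float:
    """Increase s in small increments until the governability threshold holds."""
    while E * s < c ** 2:
        s += step
    return s

E, c = 100.0, 8.0      # c**2 = 64, so the threshold is s >= 0.64
s_needed = raise_coherence(E, 0.5, c)
print(f"coherence required: s = {s_needed:.2f}")
```

The point of the sketch is the asymmetry the piece names: the fix is not dialling down E or c², which stay constant here, but deliberately raising s.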
This is the real infrastructure gap.
