
Same Same but Different 4-6
An explainer content series that simplifies blockchain concepts so even a 10-year-old could understand them.



The origin story of the Vietnamese people begins with a biological impossibility that became a political masterstroke.
Lạc Long Quân, the Dragon Lord of the Sea, married Âu Cơ, the Immortal Fairy of the Mountains. From their union, Âu Cơ did not give birth to a child. Instead, she gave birth to a sac containing 100 eggs.
From these 100 eggs hatched the ancestors of the Vietnamese people, also known as the Bách Việt. They were strong, uniform, and perfect. They looked alike and functioned as a single unit.
But soon, the parents realised there was a fundamental problem. The Dragon Lord was of the water; the Fairy was of the high earth. Their natures were opposed. "I am of the dragon race; you are of the fairy race," Lạc Long Quân said. "We cannot live together forever. We must divide to rule."
So they executed a great split. 50 children followed the father to the sea to rule the coastal plains, while the other 50 children followed the mother to the mountains to rule the highlands.
They governed different domains, optimised for different environments, but they swore a pact: "If there is trouble, we will call, and we will help each other."
In 2026, computer science helps us realise that Lạc Long Quân was the first systems architect. He understood that to scale a civilisation (or an intelligence), you cannot keep everything in one sac. You must embrace Mixture of Experts (MoE) and Swarm Intelligence.

For the last few years, we have been obsessed with building the one giant brain, much like the early versions of GPT. The ideal was having one model that could write poetry, code in Python, diagnose cancer, and tell jokes. We wanted the sac of 100 eggs to stay as one.
But experience and observation have proven this to be inefficient. A brain designed to write poetry with a creative, fluid mind like water is often bad at executing rigid logic, which needs to be structured and solid like a mountain. When you force one model to do everything, it gets bloated, slow, and expensive. It suffers from severe drift, starts hallucinating, and sometimes even forgets what it's supposed to be doing.
So, the leading AI labs like OpenAI and DeepSeek switched strategies. They adopted an architecture called Mixture of Experts (MoE). Instead of one giant neural network, they split the intelligence into individual 'experts.'
The Sea Agents: When you ask the AI to write a poem or paint a picture, the request is routed to the creative experts, the children of the dragon.
The Mountain Agents: When you ask the AI to solve a math proof or debug code, the request is routed to the logic experts, the children of the fairy.
Just like the legend, the system works better when the children are separated and allowed to specialise in their own domains. The 'mountain child' doesn't try to swim, and the 'sea child' doesn't try to climb.
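To make the routing idea concrete, here is a deliberately naive sketch of an MoE-style gate in Python. Real MoE gating is a learned neural network inside the model, not keyword matching; the experts, hint words, and scoring here are all invented for illustration.

```python
# Toy Mixture-of-Experts router: a "gate" inspects the request and sends it
# to the expert best suited for it. (Illustrative only; real gates are
# learned networks, and real experts are sub-networks, not lambdas.)

EXPERTS = {
    "sea": lambda prompt: f"[creative expert] drafting: {prompt}",       # children of the dragon
    "mountain": lambda prompt: f"[logic expert] working through: {prompt}",  # children of the fairy
}

CREATIVE_HINTS = {"poem", "paint", "story", "song"}
LOGIC_HINTS = {"proof", "debug", "solve", "math"}

def route(prompt: str) -> str:
    """Naive gate: score each expert family and dispatch to the winner."""
    words = set(prompt.lower().split())
    creative_score = len(words & CREATIVE_HINTS)
    logic_score = len(words & LOGIC_HINTS)
    expert = "sea" if creative_score >= logic_score else "mountain"
    return EXPERTS[expert](prompt)

print(route("write a poem about dragons"))
print(route("debug this proof"))
```

The key property the toy preserves: only one expert does the work per request, which is why MoE models can be huge in total parameters yet cheap per query.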

We are now moving even further, from MoE models to autonomous agent swarms.
Imagine a future software company: it won't be one 'God AI' running the show. Instead, it will be a lineage of 100 specialised agents, all hatched from the same foundational egg but deployed to different environments. You may already be using a basic form of this approach when you create 'projects' on ChatGPT or 'Gems' on Gemini.
The 50 in the Sea (The Flow): These agents live in the liquid world of the internet. They monitor stock prices, track social media trends, and handle logistics. They are fast, fluid, and reactive.
The 50 in the Mountains (The Structure): These agents live in the solid world of your local servers. They manage the database, secure the firewall, and optimise the hardware. They are slow, sturdy, and defensive.
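The sea/mountain split above can be sketched as a tiny dispatcher over two agent families. Every name and task below is hypothetical; real agent frameworks add scheduling, memory, and tool use on top of this basic shape.

```python
# Hypothetical agent swarm: fast "sea" agents watch the outside world,
# sturdy "mountain" agents guard internal infrastructure.
from dataclasses import dataclass, field

@dataclass
class Agent:
    name: str
    domain: str                     # "sea" (reactive) or "mountain" (defensive)
    log: list = field(default_factory=list)

    def handle(self, event: str) -> str:
        result = f"{self.name} ({self.domain}) handled: {event}"
        self.log.append(result)
        return result

swarm = [
    Agent("price-watcher", "sea"),
    Agent("trend-tracker", "sea"),
    Agent("db-keeper", "mountain"),
    Agent("firewall-guard", "mountain"),
]

def dispatch(event: str, domain: str) -> str:
    """Route an event to the least-busy agent in the appropriate domain."""
    candidates = [a for a in swarm if a.domain == domain]
    agent = min(candidates, key=lambda a: len(a.log))
    return agent.handle(event)

print(dispatch("BTC price moved 5%", "sea"))
print(dispatch("suspicious login attempt", "mountain"))
```

Note the division of labour: the dispatcher never asks a mountain agent to track prices, just as the legend never asks the mountain child to swim.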
The most beautiful part of the legend is the pact. Even though they separated, they remained loyal to each other.
"If the sea is attacked, the mountains will descend to fight. If the mountains starve, the sea will bring fish."
This is the biggest challenge in AI right now: interoperability.
We may already have great 'mountain' agents like law bots and great 'sea' agents like trading bots. But they often speak different languages, like estranged siblings.
For this Vietnamese analogy of AI to work, we will need a common protocol—a shared language that allows these specialised agents to coordinate. We need the mountain to trust the sea.
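What might that shared language look like? Here is a toy envelope format in which one agent asks another for help. The schema, protocol name, and agent names are all invented; real interoperability efforts (such as Anthropic's Model Context Protocol) are far richer, but the core idea is the same: both sides agree on one message shape.

```python
# Toy "family pact" protocol: every agent, sea or mountain, speaks the same
# JSON envelope, so specialists can call on each other for help.
import json

PROTOCOL = "family-pact/1.0"  # hypothetical protocol identifier

def make_request(sender: str, recipient: str, task: str) -> str:
    """Wrap a help request in the shared envelope."""
    return json.dumps({
        "protocol": PROTOCOL,
        "from": sender,
        "to": recipient,
        "task": task,
    })

def handle_request(raw: str) -> str:
    """Accept the task only if the envelope uses the agreed protocol."""
    msg = json.loads(raw)
    if msg.get("protocol") != PROTOCOL:
        return "rejected: unknown protocol"
    return f"{msg['to']} accepts task '{msg['task']}' from {msg['from']}"

envelope = make_request("mountain/db-keeper", "sea/price-watcher", "fetch latest prices")
print(handle_request(envelope))
```

The mountain can trust the sea precisely because neither side needs to understand the other's internals, only the envelope.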

Lạc Long Quân and Âu Cơ understood that a monoculture is weak. If all 100 children stayed in the mountains, they would have no fish, and if all 100 went to the sea, they would have no timber.
By splitting the swarm, they covered all terrain and built a resilient nation that could survive both floods and droughts.
As we build our digital intelligence frameworks, we should stop looking for the one 'God bot' to save us. Instead, we should look for the sac of eggs and hatch a hundred specialised, smaller intelligences, send them to the edges of the network, and teach them the most important lesson of the legend:
Call your family when you need help.