Nye's Digital Lab is a weekly scribble on creativity at the intersection of AI & Distributed Systems.
This week I'm considering what happens when automation meets "the god complex," and maybe what we can do about it.
In 1782, George Washington had effectively won the Revolutionary War.
America was chaos. We were essentially a loose collection of arguing states with no real national government, and the only functional institution was the Continental Army, which was loyal to Washington. Colonel Lewis Nicola wrote him a letter with a proposal:
become King George I of America.
Washington's response was amazing. He didn't just decline. He wrote back immediately saying that "no occurrence in the course of the War, has given me more painful sensations." He didn't even entertain the thought of being a king.
Washington had every condition that should have triggered a "god complex." Universal reverence. Military victory against impossible odds. If ever there was a setup for believing your own hype and seizing absolute power, this was it.
So what made Washington different?
Washington had heroes to model himself on. From his teenage years, he'd obsessed over Cincinnatus, the Roman general who saved Rome and then returned to his farm. He also read Cato, Joseph Addison's play about choosing death over tyranny, again and again.
He also actively sought challengers. Instead of "yes-men," he chose Thomas Jefferson and Alexander Hamilton, who disagreed about everything, including Washington himself. He deliberately created an environment of constant pushback.
But most importantly, he simply walked away when it was time.
When he became president, his private letters reveal constant anxiety about overstepping authority. He turned down a salary. He insisted on "Mr. President" instead of "His High Mightiness." He prepared to leave from day one, viewing the presidency as temporary service, not identity.
Real governance systems that protect against tyranny require acts of what we might call "centralized defiance." Someone with power must choose to distribute it, to build structures that prevent future power concentration, to reject the neurological rewards that power offers.
What sucks is that Washington might be the exception that proves the rule. Because what neuroscience has discovered is that this kind of resistance is wildly abnormal.
What happens when you don't walk away?
Elizabeth Holmes founded Theranos in 2003, claiming her technology could run "hundreds of medical tests from a single drop of blood." We all know what the problem was. The technology didn't work. Patients were getting false test results. The devices were unreliable.
But Holmes kept raising money, kept making claims, kept building her empire. It was a textbook god complex developing in real-time.
Power triggers dopamine, which floods the same brain regions that light up for chocolate, sex, or cocaine. For Holmes, the dopamine hits came one after another: first the Stanford admission, the dropout-to-startup story, the prestigious investor money. Then came magazine covers, Steve Jobs comparisons, hundreds of millions in funding. Each success triggered more dopamine, rewiring her brain's reward pathways.
But power doesn't just feel good, it literally changes how you think. Dacher Keltner, a psychologist at UC Berkeley who's spent decades studying power, puts it bluntly:
"power is like brain damage. "
His research shows that people in power exhibit decreased activity in neurons supporting empathy. Powerful people literally become less able to understand how others feel or see things from different perspectives.
Watch Holmes in early interviews versus later ones. Early on, she's engaging, listening, responsive to questions. By 2014, she's adopting a weird deep voice, wearing Steve Jobs-style black turtlenecks, and speaking in grandiose absolutes about destiny and revolution. Her ability to empathize with patients who might receive wrong medical results seems to have vanished. That's not acting.
That's neurological change.
Research published in Psychological Science found that people in high-power roles trust their own incorrect information more and actively avoid seeking out accurate data that might contradict them.
Now add the sycophant feedback loop.
When you have power, people start agreeing with you more. They laugh at bad jokes. They nod at half-baked ideas. They filter information to show you what you want to hear. Disagreeing with powerful people carries real risks to careers, relationships, or safety. But the effect is dangerous:
you stop getting accurate feedback about reality.
In private recordings from inside Theranos, you can hear Holmes talking about the company like it's destiny, like she's been chosen for this mission to change the world. Holmes started as an ambitious college dropout with an idea. She became a multibillion-dollar monster.
And here's the part that freaks me out.
There was nothing special about Elizabeth Holmes's brain. She wasn't uniquely susceptible to delusion. The research suggests this could happen to anyone given the same combination.
George Washington was the exception. Elizabeth Holmes is closer to the rule. If I take a moment off the TikTok feeds to think about the world, my fear is that the systems we're building are making god complexes more likely, not less.
In short, this is really, really bad.
We're living through radical power centralization exactly when automation makes that centralization more dangerous than ever.
A CEO makes a decision, and algorithms execute it across millions of interactions instantly. A tech platform changes its algorithm, and billions of people's information environment shifts overnight. We've built systems where individual actors have unprecedented reach and impact with unprecedented speed.
This is centralization on steroids.
Political scientists have documented that centralized systems consistently make worse decisions than distributed ones. Not because centralized leaders are worse people, but because the structure prevents good information flow and removes accountability.
In Mao's "Great Leap Forward," local officials reported fake agricultural yields because contradicting Mao was dangerous. He thought his policies were working while tens of millions starved. That's not about Mao being uniquely evil; it's about what centralized power does to information flow.
Now add automation. When Mao made bad decisions, implementation took time, creating friction points where reality could intrude. Modern automated systems remove that friction. A bad decision by someone with centralized power over automated systems propagates at code execution speed. There's no local wisdom saying...
"Um, dude, this doesn't make sense here."
It's just an algorithm doing exactly what it was told, at scale, instantly.
This is the god complex when it meets automation:
Give one person too much power, and their brain will lie to them about their capabilities.
Surround them with sycophants, and reality fades.
Add automated systems executing their vision at scale, and the consequences of the god complex become catastrophic.
Before we all give up and head to the bar, we should talk seriously about automated systems.
Here are some thoughts on the direction we need to move in.
First, structural self-awareness. You can't go on thinking you're awesome every day. We need to set up formal roles for dissent. If you're making decisions affecting millions, someone's literal job should be telling you why you're wrong, and you should be required to address their concerns on record.
Second, design systems that automatically distribute power. The Constitution tried this with checks and balances; we need digital-age versions: algorithmic systems that require sign-off from multiple independent parties before major changes ship.
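To make the idea concrete, here's a minimal sketch of what multi-party sign-off could look like in code. Everything here is hypothetical (the reviewer names, the quorum threshold, the `QuorumGate` class); it's an illustration of the principle, not a real deployment system.

```python
# Hypothetical sketch: a deployment gate that refuses to apply a major
# change until a quorum of independent reviewers has signed off.

class QuorumGate:
    def __init__(self, reviewers, quorum):
        self.reviewers = set(reviewers)   # independent parties
        self.quorum = quorum              # approvals required
        self.approvals = set()

    def approve(self, reviewer):
        if reviewer not in self.reviewers:
            raise ValueError(f"{reviewer} is not an authorized reviewer")
        self.approvals.add(reviewer)

    def can_deploy(self):
        # No single person -- including the CEO -- can push the change alone.
        return len(self.approvals) >= self.quorum

gate = QuorumGate(reviewers={"safety", "legal", "oncall", "ethics"}, quorum=3)
gate.approve("safety")
gate.approve("legal")
assert not gate.can_deploy()   # two approvals are not enough
gate.approve("ethics")
assert gate.can_deploy()       # quorum reached
```

The design choice is the point: the check lives in the structure, not in anyone's self-restraint.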
Third, mandatory "power detox" in leadership roles. Washington voluntarily took breaks; we can't rely on everyone being Washington. What if executive roles had forced sabbaticals? What if political leaders had to step away from decision-making regularly to reset their brains?
Finally, maybe we should question whether something ought to be automated at all. Remember Jurassic Park? Just because we can do something doesn't mean we actually should.
Yes, one person deciding is faster than building consensus. But fast and simple aren't the same as good or safe. Sometimes friction is actually good. The slowness, the need to convince stakeholders, the requirement to justify decisions are all exactly what prevents god complexes from destroying everything.
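Friction can even be built in deliberately. Here's a hypothetical sketch of one form it could take: a mandatory cooling-off window between approving an automated change and letting it execute. The 48-hour window and the `may_execute` function are invented for illustration.

```python
# Hypothetical sketch: deliberate friction between decision and execution.
import datetime

COOLING_OFF = datetime.timedelta(hours=48)

def may_execute(approved_at, now):
    """A change only propagates after the cooling-off window has passed,
    leaving time for someone to say "this doesn't make sense here"."""
    return now - approved_at >= COOLING_OFF

t0 = datetime.datetime(2025, 1, 1, 9, 0)
assert not may_execute(t0, t0 + datetime.timedelta(hours=12))  # too soon
assert may_execute(t0, t0 + datetime.timedelta(hours=48))      # window passed
```

It's the digital equivalent of the slowness that used to be free: a built-in chance for reality to intrude before the algorithm runs at scale.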
Let's be like the guy who actually made this country great. Find people who will keep you in check, give the power back to the people, and most importantly, when the time comes...
Walk away.
That's it for this time. I do this every week. If you vibe to the ideas I express, consider subscribing or sharing with friends. We'll see you next time.
Historical Sources:
Nicola, Lewis. Letter to George Washington proposing monarchy (May 22, 1782). Available in the George Washington Papers at the Library of Congress. https://founders.archives.gov/documents/Washington/99-01-02-08564
Washington, George. Response to Lewis Nicola (May 22, 1782). The Papers of George Washington, Library of Congress. https://founders.archives.gov/documents/Washington/99-01-02-08565
Neuroscience and Psychology Research:
Keltner, Dacher, Deborah H. Gruenfeld, and Cameron Anderson. "Power, Approach, and Inhibition." Psychological Review 110, no. 2 (2003): 265-284. https://psychology.berkeley.edu/sites/default/files/publications/keltner_gruenfeld_anderson_2003.pdf
Keltner, Dacher. The Power Paradox: How We Gain and Lose Influence. Penguin Press, 2016. https://www.penguinrandomhouse.com/books/213698/the-power-paradox-by-dacher-keltner/
Hogeveen, Jeremy, Michael Inzlicht, and Sukhvinder S. Obhi. "Power Changes How the Brain Responds to Others." Journal of Experimental Psychology: General 143, no. 2 (2014): 755-762. https://psycnet.apa.org/record/2013-35652-001
Fast, Nathanael J., Deborah H. Gruenfeld, Niro Sivanathan, and Adam D. Galinsky. "Illusory Control: A Generative Force Behind Power's Far-Reaching Effects." Psychological Science 20, no. 4 (2009): 502-508. https://journals.sagepub.com/doi/10.1111/j.1467-9280.2009.02311.x
Galinsky, Adam D., Deborah H. Gruenfeld, and Joe C. Magee. "From Power to Action." Journal of Personality and Social Psychology 85, no. 3 (2003): 453-466.
Elizabeth Holmes and Theranos:
Carreyrou, John. Bad Blood: Secrets and Lies in a Silicon Valley Startup. Knopf, 2018. https://www.penguinrandomhouse.com/books/549478/bad-blood-by-john-carreyrou/
Carreyrou, John. "Hot Startup Theranos Has Struggled With Its Blood-Test Technology." The Wall Street Journal, October 16, 2015. https://www.wsj.com/articles/theranos-has-struggled-with-blood-tests-1444881901
"The Inventor: Out for Blood in Silicon Valley." Directed by Alex Gibney. HBO Documentary, 2019. https://www.hbo.com/documentaries/the-inventor-out-for-blood-in-silicon-valley
Open Source and Decentralization:
Stallman, Richard. "The GNU Manifesto." Free Software Foundation, 1985. https://www.gnu.org/gnu/manifesto.html
Stallman, Richard. Free Software, Free Society: Selected Essays of Richard M. Stallman. GNU Press, 2002. https://www.gnu.org/philosophy/fsfs/rms-essays.pdf
The GNU General Public License. Free Software Foundation. https://www.gnu.org/licenses/gpl-3.0.en.html
Technology and Algorithmic Governance:
Tufekci, Zeynep. "YouTube, the Great Radicalizer." The New York Times, March 10, 2018. https://www.nytimes.com/2018/03/10/opinion/sunday/youtube-politics-radical.html
Mozur, Paul. "A Genocide Incited on Facebook, With Posts From Myanmar's Military." The New York Times, October 15, 2018. https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html
Political Science:
Acemoglu, Daron, and James A. Robinson. Why Nations Fail: The Origins of Power, Prosperity, and Poverty. Crown Business, 2012. https://www.penguinrandomhouse.com/books/202644/why-nations-fail-by-daron-acemoglu-and-james-a-robinson/
Piff, Paul K., Daniel M. Stancato, Stéphane Côté, Rodolfo Mendoza-Denton, and Dacher Keltner. "Higher Social Class Predicts Increased Unethical Behavior." Proceedings of the National Academy of Sciences 109, no. 11 (2012): 4086-4091. https://www.pnas.org/doi/10.1073/pnas.1118373109
Nye Warburton is a creative technologist who believes in democracy for the people, by the people. This essay was created through improvisational sessions using Claude Sonnet 4. It was refined and edited with old-fashioned human labor.
For more information visit https://nyewarburton.com