The US Bitcoin Reserve: The Digital Bretton Woods is Coming
Imagine a world where the US government quietly amasses the largest Bitcoin reserve on the planet, using it to cement the dominance of the US Dollar for another century. Sound like science fiction? It’s not. In 1944, the US pulled off a similar feat with gold, convincing its allies to store their reserves in Fort Knox and laying the foundation for decades of financial supremacy. Now, history is poised to repeat itself—this time with Bitcoin.




DeepSeek burst onto the popular news scene this week, overtaking ChatGPT as the top-rated free application on the Apple App Store.
The internet was simultaneously set ablaze with debates around the US-China AI chip war, open-source versus closed-source innovation, the two companies' opposing stances on AGI, and a string of conspiracy theories about whether DeepSeek had really managed to build a world-class LLM on a shoestring budget.

DeepSeek was built by a subsidiary of Huanfang (known in English as High-Flyer), a Chinese quant trading firm. This seems like an odd origin story on the surface, but once you dig into how these trading firms operate, the shift isn’t surprising.
Trading firms possess exactly what you need to push the boundaries of AI: exceptional technical talent and massive computing resources.
The most capable graduates from top Chinese universities like Tsinghua and Peking University typically follow one of two paths: they either join quant firms or head to the US for PhDs. Quant firms are often the preferred path for top graduates. Just like quant hedge funds in the US, these jobs pay astronomically well and can therefore afford to be incredibly selective.
Many Huanfang hires competed in the International Olympiad in Informatics or the International Mathematical Olympiad. These firms have some of the best raw human intelligence in the world already working within their walls.
But raw human intelligence isn’t enough to pop up and usurp ChatGPT overnight. These firms have been at the forefront of building deep expertise in systems engineering, constantly optimizing for microsecond advantages in trading. They've developed both algorithmic sophistication and serious systems knowledge, refined over the past decade (not just the past couple of years).
Then there's the hardware angle. While everyone else is scrambling for GPUs now, trading firms like Huanfang were stockpiling them years ago for their trading strategies. Some were even exploring building their own TPU-like hardware accelerators. This trend predates the current habit of viewing AI through a geopolitical lens. When the large language model revolution hit, they already had the computational infrastructure in place — they just needed to shift their focus.

The thing that drew me in the most about DeepSeek is their decision to open source their work. In an industry where money = GPUs = better models = more profit, this represents a significant commitment to advancing the field and will be especially important as we move closer to AGI. Having high-quality open-source models widely available creates opportunities for innovation that just wouldn't exist in a purely closed ecosystem, and as Marc Andreessen put it, it “is a gift for the human race”.
It opens a path for anyone, with any amount of resources, to use DeepSeek’s stack to build their own advanced models. Bitcoin mining farms could shift some of their resources to a subsidiary AI team. Tech start-ups could build on top of DeepSeek's foundation instead of starting from scratch. Even academic labs with sufficient compute could start pushing the boundaries of AI. The race that previously belonged to five for-profit entities just opened up a bit more.
I've seen this pattern before in tech — breakthroughs often come from unexpected places. EigenLayer transformed from a data availability solution into a restaking protocol where everything can be an AVS. ZK-proofs evolved from a privacy technology into a scaling technology, and suddenly there’s a rollup for everything. The next big advance usually comes from a direction no one was watching.
If other resource-rich companies start seriously focusing on AI development, and especially if they follow DeepSeek's open-source approach, we could see a major acceleration in AI advancement. And if AGI is on the horizon, their lightweight open-source models could create countless opportunities for innovation.
The traditional tech giants have dominated AI development so far, but DeepSeek marks a new, open-source standard. The most interesting developments happen when different fields and capabilities intersect in novel ways — we are just beginning to see this with AI. It’s exciting to think that the next major innovation may come not from huge conglomerates with abundant resources but from true innovators around the world, enabled by open source, because as we always say at Scroll, “talent is evenly distributed, but opportunities are not”.
Sandy Peng