
Annual Letter (I guess?)
TL;DR: The Capa team thanks you all for your continued support in 2025. This year was about proving that stablecoin FX and payments is no longer a theory, sharing some grounded thoughts on where payments and stablecoins are actually heading, and laying out what comes next for Capa.

Dear friends and supporters of Capa (CaPanas),

This first annual letter is intentionally light on flexing and heavy on thinking. Not because 2025 lacked numbers, but because it marked a shift in how stablecoins are ac...
I studied sustainable development and I still think most of it is bullshit.
Not because the problems aren’t real, but because the way they’re usually framed is lazy. Too much morality, not enough incentives or timelines. It always felt like a field that talked about outcomes without wanting to touch the systems that actually produce them. So I moved on and got interested in payments, liquidity, and market structure: places where things actually break when scale shows up. AI has dragged me back kicking and screaming.
Not because I suddenly believe the narrative, but because AI finally got big enough to collide with physical reality in public, and thermodynamics doesn’t negotiate with terrorists.
Once AI stopped being bursty and became always-on, it stopped behaving like software and started behaving like industry. Heat and cooling became continuous instead of reactive, and water stopped being a footnote and became a real constraint.
That shift happened quietly around 2022, and people still haven’t internalized what changed.

Before that, data centers looked manageable from a utility perspective. Water usage grew, sure, but it followed a curve that utilities understand: low double-digit growth, predictable expansion, something you can smooth with planning assumptions and incremental infrastructure. It fit the existing mental model of cloud growth, streaming, and enterprise SaaS, a model in which absolutely nothing breaks. Then ChatGPT shipped.
Inference didn’t turn off, model training runs stacked, and capacity planning stopped being sequential and became far more competitive. Everyone rushed at the same time, and cooling loads flattened into a baseline instead of peaking and falling. Water usage stopped scaling smoothly and started stepping up.
That’s the important part. Not the absolute numbers, but the slope.
Utilities plan for smooth curves, which lets municipalities plan for gradual change. AI delivered a kink in the graph, and the kink is exactly where resource management systems fail.
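A toy sketch of that kink (every number here is invented for illustration, not data about any real utility): the smooth compounding curve a utility plans around, versus the same curve with a one-time step when an always-on load lands.

```python
# Illustrative only: planned smooth growth vs. a step change in demand.

def smooth_growth(base, rate, years):
    """Demand under the utility's planning assumption: steady compounding."""
    return [base * (1 + rate) ** t for t in range(years + 1)]

def step_growth(base, rate, years, step_year, step_multiplier):
    """Same underlying slope, plus a one-time jump when always-on load arrives."""
    out = []
    level = base
    for t in range(years + 1):
        if t == step_year:
            level *= step_multiplier
        out.append(level)
        level *= 1 + rate
    return out

planned = smooth_growth(100.0, 0.08, 6)       # 8%/yr, the "manageable" curve
actual = step_growth(100.0, 0.08, 6, 3, 1.5)  # same slope plus a 50% step in year 3

gap = [a - p for p, a in zip(planned, actual)]
print([round(g, 1) for g in gap])  # zero until the step, then a widening shortfall
```

The point of the sketch: everything the utility planned for holds until the step year, and after that the shortfall compounds instead of closing, which is why a kink, not the absolute level, is what breaks the planning model.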
This is also where the tech narrative makes itself stupid. People talk about efficiency gains as if efficiency solves absolute demand. It doesn’t. To put it plainly: you can reduce water per unit of compute and still use more water overall if utilization goes to 24/7 and capacity keeps expanding. Which is exactly what happened. Efficiency is irrelevant when demand is unbounded.
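The arithmetic behind that claim fits in a few lines (toy numbers, assumptions rather than measurements): water per unit of compute falls 30%, but utilization and capacity rise faster, so total water use still grows.

```python
# Toy numbers: efficiency improves per unit while totals keep climbing.

def total_water(liters_per_unit, utilization, capacity_units):
    """Total water use = per-unit intensity x utilization x installed capacity."""
    return liters_per_unit * utilization * capacity_units

# Before: bursty workloads, modest fleet.
before = total_water(liters_per_unit=1.0, utilization=0.40, capacity_units=100)

# After: 30% less water per unit of compute, but the fleet runs 24/7
# and capacity triples.
after = total_water(liters_per_unit=0.7, utilization=1.00, capacity_units=300)

print(round(before, 1), round(after, 1))  # more efficient per unit, 5x more water overall
```

This is the classic rebound effect: the efficiency gain is real, and it still loses to utilization and capacity growth on absolute demand.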
Now layer that onto the part nobody in tech likes to talk about: water law.
Water regulation in the US is the fucking Smithsonian. In huge parts of the country it still operates on rules written when horses mattered: prior appropriation, first in time, first in right, use it or lose it. Rights are tied to historical agricultural usage, not economic output, not productivity, not modern industry. (The same issue shows up with water concessions across the West, not only in the US.)
That means water doesn’t flow to whoever can pay the most or use it most efficiently. It flows to whoever got there first. And changing that is not a market process, it’s a legal and political knife fight that takes years. This is where AI optimism tends to stop.
You can buy GPUs instantly, raise billions in a week, but you cannot manufacture new water rights on a timeline that matches AI scaling. Municipalities don’t move at venture speed, and the environmental reviews don’t compress because your inference costs went down. Public comment periods will not give a shit about the AI race.
Water constraints show up first in city council meetings, not in public earnings calls.
A data center shows up and everyone loves it at first: jobs, taxes, and some level of prestige I still don’t fully understand. Then the water numbers get published and people realize one facility pulls as much water as a small town. A drought hits and residential users get asked to conserve. Suddenly the data center is the easiest villain in the room.
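The "as much water as a small town" line is back-of-envelope arithmetic; both inputs below are rough, commonly cited ballpark figures, not data about any specific facility or town.

```python
# Back-of-envelope only; both inputs are assumed ballpark figures.
facility_gallons_per_day = 3_000_000   # assumed: a large evaporatively cooled data center
residential_gallons_per_person = 100   # assumed: per-capita daily residential use

equivalent_town_population = facility_gallons_per_day / residential_gallons_per_person
print(int(equivalent_town_population))  # a small town's worth of daily water use
```

Even if the real numbers for a given site are a few times smaller, the comparison survives, which is why it lands so hard in a city council meeting.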
That’s when everything slows down.
Permits get revisited, conditions get added, usage caps quietly appear, and expansion phases turn into “we’ll review this next year”. None of this requires changing the law. Water regulation is full of discretion, and discretion tends to be how backlash expresses itself.
This is the part no AI deck models. They assume capacity scales at the pace of capital availability. In reality, capacity scales only if permission keeps being renewed, and that permission is local, political, and fragile.
This is why water doesn’t behave like electricity or bandwidth. You can’t ship it cheaply, abstract it with software, or arbitrage it globally. Every increase in usage is visible, local, and socially legible. Once residents feel it, the argument stops being technical and starts being emotional.
Which is not something hyperscalers are good at navigating.
Markets hate this kind of constraint because it doesn’t show up cleanly. It shows up as delays, caps, and missed internal targets that analysts wave away as execution issues. Over time those delays become scarcity, which shifts pricing power. Not to the model, not to the chip, but to the one resource everyone needs permission to touch: water.
I’m not making a moral argument here. I still think a lot of sustainability discourse is empty calories, and I’m not pretending to be converted. Today I’m preaching about risk and reward.
Right now, intelligence captured the upside. Compute followed, semis ran, energy woke up late, and water barely moved. That’s not because water doesn’t matter, but because it’s local, regulated, and annoying to think about.

And after doing some reading, there's definitely a trade here.
So for 2026, I’m long water ($PHO). Explicitly, with my own capital. Utilities, treatment, reuse, control systems. The boring shit that determines whether AI capacity actually comes online or gets throttled by a zoning board.
I’m not short AI, Capa builds on this stuff, and we benefit from it.
I just don’t believe intelligence scales infinitely without paying rent to physics. I still think most of this is bullshit, but this particular bullshit has the right risk reward.