

We are at a fork in the road for the internet.
One path is smooth, fast, and endlessly accommodating. It is optimized for engagement, generated on demand, and padded with synthetic voices that sound reassuring enough to keep us scrolling. The other path is slower, messier, and far more human. It requires participation, vulnerability, and something we’ve slowly stripped out of digital life: skin in the game.
Over the past several months building intori, I’ve become increasingly convinced that this fork is not theoretical. It’s already here.
We talk about “community” more than ever, yet most online spaces today feel hollow. They are populated, active, and algorithmically well-tuned, but they lack consequence. You can join, leave, perform, disappear, and reappear without anything really being at stake.
These are faux communities. They look alive but don’t ask anything of us. No commitment. No accountability. No cost to being wrong, dishonest, or extractive.
The problem isn’t that algorithms help us find people. The problem is that made-to-order social graphs with perfectly tailored feeds and endlessly reshuffled connections aren’t reflective of real life. In reality, our relationships are shaped by friction, time, shared interests, and mutual investment.
When everything is reversible and nothing is earned, connection becomes content.
Recently, Adam Mosseri, the head of Instagram, shared a 20-image deep dive about the coming flood of AI-generated content and what many are now calling AI slop. His point was not that AI content is inherently bad, but that authenticity is becoming the scarce asset.
When everything can be generated, what matters is what can’t be faked.
This resonates deeply with what I’ve been feeling while building intori. We are entering an era where synthetic output will vastly outnumber human signal. The danger isn’t misinformation alone; it’s meaning dilution. When real human expression is drowned out by automated noise, we lose the ability to tell what actually matters.
A core issue with today’s digital social systems is the absence of skin in the game. Likes are free. Follows are free. Opinions are free. Even identities are increasingly fluid or disposable.
But in the real world, relationships have cost:
Time
Reputation
Emotional risk
Opportunity cost
When those costs disappear, so does trust.
This idea has been echoed for decades by thinkers like Jaron Lanier, who has repeatedly warned that systems extracting value from humans without compensating them degrade both individuals and society. He’s argued that we need an honest human digital economy – one where real people are recognized, rewarded, and meaningfully represented, rather than flattened into data exhaust.
“If we cannot find a way to make digital dignity economically viable, we will continue to hollow out what it means to be human online.”
— Jaron Lanier (paraphrased)
The goal isn’t nostalgia for a pre-internet past. It’s progress toward systems that respect human input as something scarce and valuable.
Watching the internet evolve right now feels eerily similar to watching Stranger Things.
In the show, the “Upside Down” isn’t a different world, it’s a distorted mirror of the real one. Familiar shapes, inverted. Life drained out. Something hostile quietly spreading underneath everything we recognize.
AI-generated personas. Synthetic influencers. Auto-reply friendships. Algorithmic communities spun up and torn down without consequence.
This is our Upside Down.
The fracture isn’t just technological, it’s social. A split between real human connection and synthetic comfort. One path leads to a world padded with generated empathy and frictionless belonging, even as the underlying social fabric weakens. The other path is harder, but alive.
Here’s the hopeful part.
Despite everything, we’re seeing early signs of resistance and rebuilding.
Networks like Farcaster are experimenting with identity, portability, and composability in ways that make it harder to fake participation. The Base App ecosystem is pushing toward real ownership, not just of assets, but of presence and reputation.
These systems aren’t perfect. They’re early. Sometimes clunky. But they point toward something important: social spaces where being human actually matters again.
That’s the future intori is being built for.
Not a network that manufactures community, but one that helps surface real people through shared context, stated preferences, and earned signals. A layer that treats human data as something to be respected, not strip-mined. A place where connection is discovered, not fabricated.
We don’t need a societal stopgap made of synthetic personalities and disposable belonging. We don’t need to retreat into a world where everything feels comforting while slowly falling apart underneath.
There is another option.
A future where:
Real people stand behind their words
Relationships have context and continuity
Digital identity carries weight
Human signal is valued over infinite noise
This fork in the road is uncomfortable because it asks something of us. Participation instead of consumption. Presence instead of performance.
But that’s the side worth choosing!
Because on the other side of this moment isn’t a quieter internet. It’s a more honest one. And that’s where real connection, creativity, and progress still live.