We, the people existing in today’s world, are like babbling babies when it comes to sharing stories.
We like to blame the internet, or LLMs, or our failing education systems for it. And while there’s some truth to that, the art of captivating an audience had been in decline for generations before technology threw that decline into overdrive.
You've probably noticed AI-generated writing online—those telltale "vibrant tapestries" and bullet-pointed posts with excessive em-dashes that flood LinkedIn, Instagram, and Twitter. But what about audio content? Have you ever listened to a podcast or YouTube video and wondered if the person speaking was just reading from a ChatGPT script?
It’s become commonplace to worry about how genuine a piece of content online really is; “dead internet theory” and the like have primed us to be skeptical and overly judgmental of everything being put out online. Even when AI involvement isn’t the case, or even possible, you’ll find pre-2020 written pieces that sound eerily like GPT, or WatchMojo videos that sound all too robotic to be an actual human voice. Is there a part of us that feels robbed when we give the gift of our connection and caring to something that turns out to be artificial?
When you stumble into what little valuable content remains on the internet, do you find yourself cherishing it more? Do you engage differently with a well-thought-out short or Substack post than you do with a Family Guy clip, a Reddit compilation, or a podcast snippet?
I think the biggest existential threat posed by AI is robbing us of caring and interest, but AI is far from the only culprit keeping us that way. Long before Italian brainrot, we had been desensitized to depth on social media.
Storytelling has been humanity's greatest talent since time immemorial. We admire the people who do it well, who do it cohesively, who speak from the heart with poetry and truth.
Never before have we been faced with the confusing experience of resonating with something that turns out to have been made with just a prompt.
When we connect with anything someone else says, we’re seeing ourselves in their words. We as a species value someone caring enough to bring a glimpse of their truth to the world. If we resonate with something, it's because we've felt it too, to an extent. And finding out someone prompted the answer to our conflict, or hurt, or challenge into the world "effortlessly" is, in a way, devaluing what we felt and went through.
No one likes to feel like they’re worth less. And much like our caring for truly standout creativity, people’s acceptance of solutions and tools falls on a spectrum. Some people will feel like AI is an insult to human effort; others won’t mind, as long as they’re entertained, or the job they’re getting paid for is “done”.
The fact that we’re going through this right now is more of a reflection of how the past waves of tech have primed us not to care. AI wouldn’t be this impactful if our society had a higher bar for what makes the cut, if jobs hadn’t become robotic, if content mills hadn’t gotten us used to the basest levels of entertainment.
AI is useful, but human dedication is meaningful. I’ve battled with this conundrum for years.
I used to think AI’s role in empowering people was that of an accelerant. You got more done, you iterated quicker, you processed more information with less mental drain.
To put this idea to the test, I set out to write a book in 10 days using GPT back in 2022, with one commandment: I would be the one writing the book, the ideas would be mine, and the AI would just help me get the bulk of the text in there so that I could tweak it.
And yeah, I wrote the book, 100 pages of ideas I had been brewing inside me about the creative act, building your path, and finding the courage to share. But in the end, I felt empty in a way. I had put my thoughts out into the world, but they rang hollow. That wasn’t my voice that was saying what I felt deep within me.
I thought having my first book in the bag would give me the confidence to polish it and go on to write another, and then another after that. Quite the opposite.
Even then, I highlighted that the main outcome of the exercise was the people who came out to support me while I publicly posted about my process, even if, in retrospect, with a more finely tuned knack for detecting AI-written text, the words in that book come off as very obviously written by GPT. The Unwritten Authority taught me the very real lesson that the true path actually is the friends we make along the way.
Setting out to write a book with AI showed me that good things take time, and it taught me to appreciate the art of writing beyond just the ideas you’re sharing. The medium can sometimes be more important than the message, and that’s especially true in matters of conviction and putting your heart out into the world; so much so, that I’ve never used an LLM to aid me in my writing since.
So why are we struggling to see this crisis of depth in the work we put out into the world? I mean this from an AI-optimistic perspective. Why are we using LLMs and “AI” in ways they’re clearly not suited for?
In short, money.
We use AI because our ways of working reward the mere existence of an output more than whether that output actually does something of value. The people posting AI edits of Marvel films or rehashing podcast clips are making bank because they get paid per impression. We’re taking a machine-gun approach to meaning, and the enjoyment we take out of life is suffering for it.
I don’t mean to come across as saying that AI is making us dumber, or less valuable, or less interesting. Those, in my opinion, are all temporary and not really based in reality. But the fact is, we are going through an awkward moment in time where we need to figure out how to use this tool properly.
Using any kind of tool to cut away what makes people special is an exercise in futility. Why take away the best that we have to offer when we could be using these exciting tools not to accelerate, but to curate?
I think the next generation of AI use cases will not be about “cutting out menial labor”, but about finding the most impactful and meaningful 10% to place your focus on. Einstein changed the face of modern physics while working at a patent office; who are we to say what kind of labor isn’t serving a larger purpose, even when tedious?
When sharing this article’s core ideas with friends, the idea of AI as an integrated tool kept coming back up. “AI is like the internet”, “AI is like a calculator”, “AI is like a car”: What we call AI today is nothing more than a machine built on people’s collective stories. And like any good story, it’s the uniqueness that counts.
A story will only land when properly told, to the right audience, in the right way, at the right time. That’s something an LLM will never be able to do because of the way it’s designed. If you aggregate every good story out there, all you end up with is a jumbled mess.
What it can do, though, is help you come up with the experiences to tell a good story in the first place. It can share sources with you, parse through data, collect useful context. All of these advantages have their place.
But if you let yourself become a speaker box just to participate in society the way it’s currently set up, you're depriving yourself of speaking from the heart.
People care about people. The thing that always ends up mattering in the end is that when we reach out into the void and ask for connection, we see another person in the echo that reaches back.