TL;DR:
The contrast between first and second content squeezes
How LLMs fundamentally change the publisher-reader relationship
The emerging licensing model for major publishers
The particular threat to smaller publishers
Putting content onchain can solve many of these problems
The major unlock of the internet was reducing the cost of distribution to zero. Industries previously relied on controlling supply chains and distribution. The internet enabled a few key platforms to aggregate demand and capture the middleman's value by directing that demand to supply.
The three big aggregators:
Google: aggregate information demand
Facebook: aggregate social connections
Amazon: aggregate product demand, "the everything store"
This is described in detail in Ben Thompson's legendary framework, Aggregation Theory.
Today, I'll focus on the first one: information demand.
Google's strategy simplified:
Provide great products for free (Gmail, Google Drive, and more), establishing itself as a gateway to the internet. To understand how key this is, consider that Google pays Apple $20B+ per annum to remain the default search engine on the iOS platform...
Capture intent via search
Sell intent-specific ads in the search results
This evolution of the internet marked the first phase of "the content squeeze."
As described in Ben Thompson's thesis, publisher content went from bundle product (newspaper) to fragmented pieces (articles). Rather than seek out publisher websites, users search for things on Google, and have links to relevant articles served to them.
But, they were still served by the publisher.
We're now entering the next phase of the squeeze.
With Large Language Model (LLM) products (like ChatGPT) on the rise, the entry point for online search shifts.
Traditional search querying:
Search
Get served relevant links
Click link and go to a site
LLM-based search:
Search
Get answer
It eliminates a step, and it also lets the aggregator keep users on its platform. And they have a clear incentive to do so: distribution is the most valuable thing, and the more attention you capture and retain, the better positioned you are to monetise it.
If the first phase of the content squeeze unbundled content, the second phase breaks it down to its rawest form, served embedded directly in answers.
This creates a new dynamic, and generates many interesting questions.
Monetisation is one such question. We're already seeing examples of the big AI companies landing licensing deals with publishers. Some notable ones:
New York Times signed a ≈$100M/3 year deal with Google (src)
Financial Times with OpenAI for $5-10M/year (src)
News Corp with OpenAI for $250M/5 year (src)
This is a very different way for publishers to monetise. In practice, they're almost coming full circle, back to selling a packaged bundle ("all content for x time"), except now it happens as a B2B backroom deal.
I obviously don't know the details of these deals, but I do wonder how good they will turn out to be after 3-5 years (and in whose favour).
And, what happens to the publisher-aggregator user experiences where there isn't a top-level content deal?
In a recent blog post, Tomasz Tunguz mentions Chegg, an online homework helper that's now suing Google because Google's AI Overview feature effectively gives users answers without their ever visiting Chegg's site. Chegg's traffic has dropped so much that it's considering a sale of the company.
These are all big publishers, what about everyone else?
Maintaining content monetisation models and licensing deals manually is not scalable in a digital world where AI-based search takes a firm hold and AI agents roam, using whatever information is available to accomplish the goals they're given. It's too slow.
This will bleed over to commerce as well, as we've discussed in a few pieces lately.
The only scalable solution I see is embedding value with content. If licensing agreements are executable as smart contracts and attribution graphs live onchain, then value can be distributed, even streamed, to publishers as AI outputs generate income.
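To make the idea concrete, here is a minimal sketch in Python rather than an actual smart contract: assume each AI answer carries an attribution graph of (publisher, weight) pairs, and the income that answer generates is split pro rata across those publishers. Every name and weight below is illustrative, not a real protocol or real deal terms.

```python
from dataclasses import dataclass

@dataclass
class Attribution:
    publisher: str   # in a real system, likely an onchain address
    weight: float    # share of the answer attributed to this source

def split_revenue(income: float, attributions: list[Attribution]) -> dict[str, float]:
    """Distribute income proportionally to attribution weights."""
    total = sum(a.weight for a in attributions)
    if total == 0:
        return {}
    return {a.publisher: income * a.weight / total for a in attributions}

# Hypothetical example: one AI answer earned $0.05, and its
# attribution graph names three sources with these weights.
payouts = split_revenue(0.05, [
    Attribution("nyt.eth", 0.5),
    Attribution("ft.eth", 0.3),
    Attribution("indie-blog.eth", 0.2),
])
```

The point of the sketch is that once attribution is machine-readable, the split needs no backroom negotiation per answer; it can run continuously, per query, which is what "streaming" value would mean in practice.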
And, as commerce follows, running it all onchain will enable new types of ad markets to follow.
For now, many publishers are probably still underestimating the extent to which things will change when the second content squeeze takes proper hold. It creates more distance between publisher and end user. While that may be offset by these "catch all" licensing deals, it also raises the question: if distribution is the most valuable thing, is everything else just a race to the bottom?
As always, reach out to me on X or Farcaster if you have feedback or are working on something cool you want to discuss.
BRG