Who Pays for the Next Scoop?
There is a profound shift happening to the internet, and it's happening faster than most people realize. Discovery is moving from search to AI-generated answers, and that shift is already upending how content is consumed. My guess is it will very quickly change how content is valued as well.
I’ve written about this shift before, but some stats a good friend sent me yesterday brought it all into sharp focus. Matthew Prince, CEO of Cloudflare, recently gave an incredible interview:
📊 10 years ago: Google crawled 2 pages for every visitor it sent (2:1)
📊 6 months ago: Google 6:1, OpenAI 250:1, Anthropic 6,000:1
📊 Now: Google 18:1, OpenAI 1,500:1, Anthropic 60,000:1
The entire talk is absolutely worth your time, but that last number should stop you cold. Anthropic is now crawling 60,000 web pages for every one human visitor it drives to a site. That’s not indexing for search. That’s extraction at a profound scale. It means their AI systems are consuming the internet not just to help users find content—but to replace it with synthesized answers. What’s staggering is how fast this shift happened. 6 MONTHS!
To put it plainly: AI is now by far the dominant “reader” of the internet.
It means the audience for content isn't people—it's AI. It means creators are being mined for training data, often without credit, compensation, or even awareness. It means the traditional model—publish online, rank in search, monetize through traffic—is being quietly but quickly dismantled.
I don’t want to spin this entirely in a negative light. Part of this shift is fascinating and absurdly promising. AI has the potential to help people synthesize complex topics, pull together perspectives, and learn more efficiently. I’ve always thought of the internet as a massive database of all the acquired knowledge from human history. Now we have the most powerful tool ever invented to take that knowledge and present it in a concise, practical way. Done right, this innovation has the potential to expand our understanding of complex topics in a way that we’ve never seen before.
But there are two massive risks we can’t ignore:
- Is our understanding of complex topics only going to run an inch deep? When people get used to reading two-paragraph summaries, they may feel informed, but will they actually grasp the deeper context and nuance?
- Garbage in, garbage out. These AI systems are only as good as the material they're trained on. If the source content is weak, biased, or misinformed, then so are the answers they generate. Imagine an AI trained entirely on Twitter posts. Good Lord. And what kind of hellscape will we see when AI is trained entirely on AI?
Which brings us to the heart of the problem: Who is producing the real, original, verifiable information?
Right now the answer is reporters, researchers, writers, analysts—the people who call sources, travel to war zones, attend hearings, follow paper trails. AI CANNOT do this work.
But here's the real kicker: the business model that has sustained those people is being upended, while AI companies raise billions using their work as fuel. Media companies are signing deals with OpenAI and others ($25 million here, $50 million there) to license their archives. But the writers and journalists who created that work haven't been part of the deal.
I think that's incredibly short-sighted.
Because if we don't figure out how to support and protect the people doing primary-source work, we're going to be left with a world where AI is confidently summarizing everything. And as someone who has interacted with AI a lot lately, I can tell you firsthand that it will confidently give you an answer whether it is right or not. What kind of information will it have left to summarize?
That’s why we started LedeWire.
We believe the creators doing original work should have more control, more visibility, and a better path to being paid. We're building tools to help them monetize directly, outside the ad-driven and full-subscription ecosystems. And we're working on ways to ensure that if their work is being used to train models, they can benefit from that too.
AI is not going away. But neither is the need for real, on-the-ground, human-powered journalism. We just need to make sure the future doesn’t forget who made the internet worth reading in the first place.
— Morgan