If I asked you what the native business model for content is, you'd probably say either advertising or subscriptions. But I am starting to think that AI is to content what search engines are to browsers. Money machines.
I was emailing with my friend Lock a few weeks ago and we were talking a bit about my 2024 predictions post. I made reference to the section in that post about AI and litigation and said:
maybe we will get a settlement that makes all the big AIs pay 2/3 of their revenue to content companies and writers and we will have the native revenue model for media!
I was only half joking.
When the Hollywood writers went on strike, I suggested to all of my writer friends that they should be happy to let AIs write films and TV shows as long as they get paid to sit home and do nothing under the premise that the AIs were trained on their work and so they are due royalties.
I was only half joking about that too.
In Chris Dixon’s book, Read Write Own, which I wrote about a few weeks ago, he said this:
Most current AI systems have no economic model for creators... in the long run, we are still going to need an economic covenant between AI systems and content providers. AI will always need new data to stay up to date. The world evolves: tastes change, new genres emerge, things get invented. There will be new subjects to describe and represent. The people who create content that feeds AI systems will need to be compensated.
I could not agree more.
The only question is how this will come to be.
I think web3 is sitting on the answer.
Tune in tomorrow for more on this.

21 comments
Heck yeah @fredwilson.eth, I'm into this vision: "But I am starting to think that AI is to content what search engines are to browsers. Money machines." Excited to read part 2 https://avc.xyz/the-native-business-model-for-content
You should put this on @kiwi https://news.kiwistand.com/new
Thanks for the prompt. I think I saw that in the feed last week 🙂
Ah, OK. It's growing! I used to be able to track all links in my head :D
Hi Casters. I've been thinking a lot about the various intersections between AI and Web3 and have a series of posts lined up for this week on the topic. Here is the first of them https://avc.xyz/the-native-business-model-for-content
I’ve been thinking a lot in this area too. Excited to read what you have lined up. Commenting so I can come back to this in a bit.
@cameron up your street (is this a phrase in the US?)
Thanks for the tag! Unfortunately, I think there is a labor cohesion problem among creators, where the truly independent thinkers and creators who ostensibly could benefit the most from their unique work training AI are also the least likely to work together to fight the consolidated legal fight. Also…
That phrase is definitely intelligible, but I think the common US phrasing is “up your alley”
Interesting, why do you think that? I think of Holly Herndon's thing as the biggest example of ‘artists collaborating together’
Looking forward to more posts in this series as I’ve been thinking a lot about this too. I really enjoy the AIxCrypto framing on the a16z podcast https://open.spotify.com/episode/4z4eF9UgXwmuUrHgCQEfsE?si=0clQ3L_mRy2DOIGzKPGpPw
Nice piece. The world will be divided into two camps - those who fear AI and those who embrace it. Collected 🫡
Please not one more debt to the past. We have been gating creativity with copyrights for 100 years. The only ones who make a real profit out of such schemes are brokers and middlemen and a very small percentage of creators.
Keep reading. More tomorrow
This is what has me excited about web3 gaming. We can have decentralized gaming studios where we can easily compensate people for their assets using crypto and iterate on games using AI? Sweet. I still haven’t seen models good enough for my dream of the next Pixar being 10 people, but we’ll see.
100% agreed
@fisioyoga
Sorry, no. This whole idea of AI models having to pay royalties makes no sense considering what AI models actually do with the data they are trained on. And the better the AI models get (the more they generalize) the less it makes sense. It only makes sense for broken, overtrained models.
Also, to give an example from the non-AI world: I'm a professional educator. I teach students stuff they then apply in their jobs for the rest of their lives. Nobody would ever argue the students need to pay me royalties every time they use one of my strategies for data visualization/storytelling/... Makes no sense.
Why should AI be different? Why is AI not allowed to learn in the same way humans learn, by processing tons of material (text, images, audio, whatever) and trying to see patterns and relationships?
Thanks for sharing here!