Wall Street is getting increasingly concerned that the current AI mania is a bubble that will burst and bring the entire market down with it. Silicon Valley brushes that concern off, and VCs and big tech companies continue to pour money into AI in search of big payoffs.
So who is right?
At times like this, I like to turn to the data and ignore the prognosticators.
Evan O'Donnell is a VC and blogger who took it upon himself to build a model that compares the growth rate of inference token usage against the pace of infrastructure investment and comes up with some answers. This post details that approach.
But what I like most is Evan's dashboard, which you can see here.

My only critique of this approach is that the data is not real-time. Not even close. When I asked Evan about that via email this past weekend, he said:
No material update on the token numbers.
As of Sept/Oct, token consumption is growing at ~13% monthly across providers (down from 30-40% earlier this year). I'm tracking everything here, under the Reported Token Growth table.
The big players only report these figures at earnings, so likely no major updates until Q1. Google’s next call (Feb) should be the best pulse... they've been consistent in reporting and represent a big share of the market (especially this quarter with the new Gemini release).
For a near-real-time pulse, the best proxy I've found is OpenRouter's weekly usage chart. In the last couple weeks, growth looks steady (in line with the Sept/Oct numbers). Just keep in mind it's tracking a thin slice of the market (1-2%). Directionally helpful, but not something I'd trade off of.
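To put those growth rates in perspective, here is a quick back-of-envelope sketch (my own arithmetic, not Evan's model) showing what the monthly rates he cites would imply if they compounded for a full year:

# Rough annualization of the monthly token-growth rates quoted above.
# Illustrative arithmetic only, not Evan O'Donnell's model.

def annualized(monthly_rate: float, months: int = 12) -> float:
    """Compound a monthly growth rate over a given number of months."""
    return (1 + monthly_rate) ** months - 1

for label, rate in [("earlier this year, low end (~30%/mo)", 0.30),
                    ("earlier this year, high end (~40%/mo)", 0.40),
                    ("Sept/Oct (~13%/mo)", 0.13)]:
    print(f"{label}: roughly {annualized(rate):.0%} per year")

# ~30%/mo compounds to roughly 23x in a year
# ~40%/mo compounds to roughly 57x in a year
# ~13%/mo compounds to roughly 4.3x in a year

In other words, even the decelerated 13% monthly figure still implies more than a 4x increase in token consumption over a year, if it holds.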
So where does this leave us?
The current infrastructure spend rates are justified if AI usage keeps growing at today's pace. If the growth rates start to decline, there could be trouble.
So we should all be watching the numbers when they come in over the next quarter.
Until then, the debate will rage on.