5 Comments
Martiel:

I'd add that there's another way to create magic money, which is long term contracts.

Let's take the $300B contract between Oracle and OpenAI as an example: a 5-year deal worth $300 billion that starts in 2027. It's a nice win for Oracle; they get a splashy announcement and can list the whole amount under remaining performance obligations (RPOs).

That's much better for Oracle than just building the data centers and making money afterwards, which would show up as cost during construction and only later as annual revenue. Meanwhile, Oracle gets a brand boost by claiming to be a leader in AI. And for that privilege they may even make it easier for OpenAI to get out of the deal.

Now, RPOs have value, but they are also another way to inflate numbers in order to impress investors and the public.
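To make the accounting difference concrete, here's a rough sketch using the deal's headline numbers (the even per-year split is an assumption on my part; real contracts ramp up and can include escape clauses):

```python
# Rough sketch: headline RPO at signing vs. revenue actually recognized
# each year. Assumes the reported $300B / 5-year deal is billed evenly.

TOTAL_CONTRACT = 300e9  # $300B total, per the reported deal
YEARS = 5

rpo_at_signing = TOTAL_CONTRACT            # booked as backlog immediately
revenue_per_year = TOTAL_CONTRACT / YEARS  # recognized only as services are delivered

print(f"RPO announced at signing:  ${rpo_at_signing / 1e9:.0f}B")
print(f"Revenue recognized per year (if even): ${revenue_per_year / 1e9:.0f}B")
```

So the day the deal is signed, Oracle can point to a $300B backlog figure, even though at most ~$60B a year would ever hit the income statement, and only once the data centers are actually delivering.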

JV (Oct 11, edited):

"That means you would add an additional $230 per hour to the cost of running GPT4 inference. So the total would be ~$380 per hour, or roughly 3x the costs that Martin lays out."

You are adding the total training cost to the cost of operating a single inference cluster. Based on recent numbers, OpenAI's inference fleet is equivalent to thousands of such clusters. (Epoch estimated 480k H100s in their digital-worker analysis.)

Even if the training cost were $100M and only 1,000 clusters ran GPT-4 over its lifetime of 20k hours, that adds just $5 per cluster-hour to costs.
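Here's the back-of-envelope version (the $100M training cost, 1,000 clusters, and 20k-hour lifetime are deliberately conservative assumptions, not reported figures):

```python
# Back-of-envelope: amortizing training cost across the inference fleet.
# All three inputs are conservative assumptions from this thread; Epoch's
# 480k-H100 estimate implies a far larger fleet, hence an even smaller number.

training_cost = 100e6    # $100M assumed total training cost
clusters = 1_000         # inference clusters serving the model
lifetime_hours = 20_000  # assumed serving lifetime per cluster

amortized = training_cost / (clusters * lifetime_hours)
print(f"${amortized:.2f} per cluster-hour")  # -> $5.00
```

Spread over the whole fleet, training amortization is a rounding error next to the per-hour inference cost, not a 3x multiplier.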

theahura:

Oh, good callout. Will add that to the article.

Kenny White:

It seems to me that the revenue from operating an AI model is much higher than the cost of doing so. Even when you factor in the training costs? I'm willing to bet they'll find a way to bring training costs down so that they ultimately balance out with revenue.

The real crux of the issue is revenue growth. My understanding is that revenue is growing at 2x per year, which is pretty fast. But how long will that last? How long until the AI market is saturated and revenues start growing at rates closer to the rest of GDP? If revenue keeps growing fast for the next ~5 years, things will probably be ok. But if not? If growth stalls?
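Just to show how much rides on that growth rate, here's a toy projection (the $10B starting revenue is purely illustrative, as is the 5% "GDP-like" rate after saturation):

```python
# Toy projection: what sustained 2x/year growth implies vs. stalling out
# at a GDP-like rate. Starting revenue is a hypothetical round number.

start_revenue = 10e9  # illustrative $10B/year starting point

for year in range(1, 6):
    doubling = start_revenue * 2 ** year      # sustained 2x per year
    gdp_like = start_revenue * 1.05 ** year   # ~5%/year after saturation
    print(f"year {year}: 2x path ${doubling / 1e9:>5.0f}B"
          f" | ~GDP path ${gdp_like / 1e9:>5.1f}B")
```

Five years of doubling is a 32x gap versus the saturated path, which is roughly the difference between today's capex looking prudent and looking catastrophic.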

theahura:

I'm not sure training can get cheaper. It can if you hold model capability fixed over time: as chips and energy get cheaper, training GPT-4 again will get cheaper. But nobody is spending those chips and that energy on GPT-4; they're spending them on GPT-5/6/7. So the competitive pressure means all of these labs are overspending beyond what they can hope to recoup.
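A toy version of that dynamic (both rates below are made-up illustrations, not estimates):

```python
# Sketch of the point above: per-FLOP cost falling doesn't make frontier
# training cheaper if compute per frontier model grows faster. Both rates
# are illustrative assumptions.

cost_per_flop = 1.0   # normalized, year 0
frontier_flops = 1.0  # normalized compute for the current frontier model

for year in range(1, 5):
    cost_per_flop *= 0.7   # assume hardware/energy ~30% cheaper per year
    frontier_flops *= 4.0  # assume frontier compute grows ~4x per year
    fixed_model_cost = cost_per_flop * 1.0          # re-training today's model
    frontier_cost = cost_per_flop * frontier_flops  # training the next frontier model
    print(f"year {year}: fixed-capability cost {fixed_model_cost:.2f}x,"
          f" frontier cost {frontier_cost:.2f}x")
```

Under those made-up rates, re-training today's model gets steadily cheaper while each new frontier run costs ~2.8x the last, which is the overspending treadmill I mean.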
