Discussion about this post

Martiel:

I'd add that there's another way to create magic money, which is long term contracts.

Let's take the $300B contract between Oracle and OpenAI as an example: a five-year, $300 billion deal that starts in 2027. It's a nice win for Oracle: they get a splashy announcement and can list the amount as remaining performance obligations (RPO).

That's much better for Oracle than simply building the data centers and earning money afterwards, which would show up as costs during construction and then trickle in as annual revenue. Meanwhile, Oracle gets a brand boost by claiming to be a leader in AI, and for that privilege they may even make it easier for OpenAI to get out of the deal.

Now, RPOs have value, but they are also another way to inflate numbers in order to impress investors and the public.

JV · Oct 11 (edited):

"That means you would add an additional $230 per hour to the cost of running GPT4 inference. So the total would be ~$380 per hour, or roughly 3x the costs that Martin lays out."

You are adding the total training cost to the cost of operating a single inference cluster. Based on recent numbers, OpenAI's inference fleet is equivalent to thousands of such clusters (Epoch estimated 480k H100s in their digital-worker analysis).

Even if the training cost were $100M and only 1,000 clusters ran GPT-4 over its lifetime of 20,000 hours, amortizing that cost adds just $5 per cluster-hour.
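The amortization arithmetic above can be checked in a few lines (the figures are the hypothetical worst case from this comment, not measured values):

```python
# Amortize a one-time training cost across the whole inference fleet.
# All three numbers are the assumed worst case from the comment above.
training_cost = 100_000_000   # dollars, assumed upper bound on training cost
clusters = 1_000              # inference clusters serving the model
lifetime_hours = 20_000       # hours the model stays in service

# Total cluster-hours delivered over the model's lifetime.
total_cluster_hours = clusters * lifetime_hours

# Training cost spread evenly over every cluster-hour of inference.
cost_per_cluster_hour = training_cost / total_cluster_hours
print(f"${cost_per_cluster_hour:.2f} per cluster-hour")  # → $5.00 per cluster-hour
```

The key point is the denominator: dividing by a single cluster's hours instead of the fleet's total cluster-hours overstates the per-hour training overhead by a factor equal to the fleet size.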

