Your AI Budget Is Structured Wrong

Most executives treat AI as a cost line. That’s the wrong mental model, and it’s already creating the wrong decisions.

A venture capitalist recently described burning 250 million tokens in a single day, twenty times the amount he had used six weeks earlier. He was running five parallel agent flows. One pulling code history. Another querying error logs. A third building a presentation. Two more checking and critiquing the work. All at once, all in the background.

This isn’t isolated. Combined hyperscaler capex hit $112B last quarter. Google Cloud customers overran their committed AI spend by 45%, and the overrun is still accelerating; 330 enterprise accounts now each consume more than a trillion tokens. Once models hit production, consumption scales exponentially. The budget can’t hold its shape.

The economics here aren’t cost-per-query. They’re compute deployed against your work queue. On METR’s time-horizon benchmark, the latest agents now complete tasks that take human experts 12 hours, up from 1 hour a year ago.

So the relevant question is no longer “what does a prompt cost?” It’s “how much parallel compute are we running against our outstanding work?” And to answer that, you need a reliable measure of the work itself.
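As a rough illustration of that reframing, the numbers above can be turned into a back-of-envelope ratio. This is a hypothetical sketch, not a metric the article prescribes: the 100-task work queue is an invented assumption; the token and flow figures come from the anecdote.

```python
# Hypothetical sketch: reframing AI spend from cost-per-query to
# parallel compute deployed against an outstanding work queue.
# Token/flow figures are from the article; the metric and the
# 100-task queue are illustrative assumptions.

def parallel_compute_ratio(tokens_per_day: float, parallel_flows: int,
                           outstanding_tasks: int) -> float:
    """Tokens deployed per outstanding work item, per parallel flow."""
    return tokens_per_day / (parallel_flows * outstanding_tasks)

# The VC's usage: 250M tokens in one day across 5 parallel agent flows.
tokens_per_day = 250_000_000
flows = 5

# Six weeks earlier he used one twentieth as much.
earlier = tokens_per_day / 20
print(f"Growth factor: {tokens_per_day / earlier:.0f}x")  # prints 20x

# Assumed backlog of 100 outstanding tasks (illustrative only).
ratio = parallel_compute_ratio(tokens_per_day, flows, 100)
print(f"Tokens per task per flow: {ratio:,.0f}")
```

The point of the ratio is that its denominator is the work queue, not the query count: the same spend looks expensive per query and cheap per unit of backlog cleared.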

This is a productivity architecture decision, not a software procurement one. CFOs optimizing for cost-per-seat or query volume are measuring the wrong variable. The organizations getting ahead aren’t spending less on AI. They’re deploying it differently. Token spend, deliberately structured across parallel workstreams, signals how seriously a business is using what it’s bought.

When did you last review your AI investment as a productivity deployment decision rather than a license management exercise?


References: Tom Tunguz, “The $112 Billion Quarter” (April 2026); METR Time Horizon tracker, with methodology in “Time Horizon 1.1” (January 2026).
