Getting AI To Work by Brennan McDonald


What’s your AI budget for 2026?

Enterprise AI spend is exploding. The harder question is whether it will actually pay off.

Brennan McDonald
May 04, 2026

Hi there,

If you’re new here, I’m Brennan McDonald, and Getting AI To Work is about the people side of AI transformation, because it’s not a technology problem, it’s a people one. Paid subscribers get access to a library of diagnostics and playbooks to apply to their own business, plus deeper insights.

In today’s newsletter:

  • Enterprise AI spend will keep rising and needs to pay off

  • Why change management matters more than ever

  • So what’s stopping you from getting AI to work?

Reply to this email or drop a comment or DM with feedback.


Enterprise AI spend will keep rising and needs to pay off

How much is your business spending on AI each month? What is your AI budget for 2026?

Last week I shared the story of how Uber has already “blown” the budget it allocated for 2026 because of insatiable demand for AI tokens, especially from engineers.

When you add that to the enormous CapEx guidance from the Big Tech hyperscalers, over USD 700 billion planned for 2026, it’s clear that an enormous amount of money is going into enterprise AI spend across the whole stack.

Revenue from providing AI to enterprise clients via the cloud platforms is growing rapidly. Changes to the agreement between Microsoft and OpenAI mean that AWS Bedrock now offers OpenAI models.

You can see the demand management strain in premium-model pricing, tighter subscription limits, and providers policing heavy or non-compliant use of flat-rate plans.

All of this combined is why I claim this isn’t a bubble. This is a massive infrastructure build-out. If you read any of the Wall Street research reports or independent research, it goes into immense detail. It’s not just the AI labs. It’s the data centres, the power suppliers, and the chip manufacturers. It’s all of the associated products and services serving one goal: more compute.

This is all progressing much faster than any previous technology wave, which means some of the mental models we built around earlier tech waves are probably wrong.

There’s another aspect of the cost of AI use that I think is massively underexplored: open-source models that are much cheaper than closed-source ones. My own experimentation has let me build workflows that use cheaper models and get better outcomes through more structured prompts and context management.

Being able to assign a specific model to a specific subtask in a process means the flip side of rising enterprise AI spend is that a lot of teams will be able to manage costs at a much more granular level to keep up.
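The per-subtask routing idea can be sketched in a few lines. This is a minimal illustration, not anything from the newsletter: the model names, prices, and capability tiers below are all made up, and a real router would score tasks and models far less crudely.

```python
# Hypothetical sketch: route each subtask to the cheapest model that can
# handle it, then tally the monthly bill. Names and prices are invented.
from dataclasses import dataclass


@dataclass
class Model:
    name: str
    cost_per_1k_tokens: float  # USD per 1,000 tokens (illustrative)
    capability: int            # rough quality tier; higher is better


# Illustrative catalogue: a cheap open-weight model and a frontier model.
CATALOGUE = [
    Model("open-small", 0.0004, 2),
    Model("frontier-large", 0.0150, 5),
]


def route(task_difficulty: int) -> Model:
    """Pick the cheapest model whose capability meets the task's difficulty."""
    candidates = [m for m in CATALOGUE if m.capability >= task_difficulty]
    return min(candidates, key=lambda m: m.cost_per_1k_tokens)


def monthly_cost(tasks: list[tuple[int, int]]) -> float:
    """tasks is a list of (difficulty, token_count) pairs; returns total USD."""
    return sum(
        route(difficulty).cost_per_1k_tokens * tokens / 1000
        for difficulty, tokens in tasks
    )
```

Even this toy version shows the lever: if most of your volume is routine (difficulty 2) work, routing it to the cheap model instead of sending everything to the frontier model cuts the bill by an order of magnitude.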

Another example is my recent testing of DeepSeek V4 Pro. It’s nearly at the frontier. If you read anything in the news claiming that Chinese open-source models are years behind US models, you’re getting an outdated picture. The most powerful US models are still the best, but the margin is shrinking. For most daily business tasks, you probably don’t need Opus 4.7 or ChatGPT 5.5 Pro.

Again, what is your AI monthly spend or AI budget? Think about that number.

Why change management matters more than ever
