Anthropic just announced a massive partnership with Google and Broadcom to secure "multiple gigawatts" of next-generation compute power. While the tech press obsesses over the scale, the real story is what this signals about AI becoming utterly dependent on industrial-grade infrastructure — and what that means for businesses trying to build AI into their operations.
The Infrastructure Arms Race is Real
This isn't just another corporate handshake. Anthropic is essentially admitting that running Claude at scale requires power infrastructure comparable to small cities. Multiple gigawatts means they're planning to consume enough electricity to power hundreds of thousands of homes, just to keep their AI models running.
Google brings its cloud infrastructure and energy management expertise, whilst Broadcom provides the specialised chips that make modern AI possible. It's a trifecta that signals AI companies are moving beyond software partnerships into industrial-scale resource planning.
What's telling is the timing. This comes as AI companies face mounting pressure over energy consumption and compute costs, with some services already implementing usage caps and throttling during peak times.
Computing Power Becomes the New Oil
For small businesses, this development reveals an uncomfortable truth: AI isn't becoming cheaper or more accessible. It's consolidating around companies that can afford industrial-scale infrastructure investments.
“The gap between AI haves and have-nots isn't about algorithms anymore — it's about who can afford the electric bill.”
We're seeing this play out in real time. Claude Code, Anthropic's coding assistant, has been locking users out for hours due to capacity constraints. When even a well-funded AI company struggles with demand, it suggests the computing bottleneck is more severe than most businesses realise.
This infrastructure arms race fundamentally changes how you should think about AI tools. The days of assuming unlimited access to powerful AI models are ending. We're moving into an era where AI capacity becomes rationed, premium-priced, or both.
Planning for the New Reality
The smart money isn't betting on AI getting cheaper — it's preparing for a world where AI access becomes stratified. Businesses that build their operations around unlimited AI usage may find themselves squeezed out when capacity gets tight or pricing shifts.
This doesn't mean avoiding AI tools. It means being strategic about which ones you depend on and having backup plans when your primary AI services hit capacity limits or raise prices.
Consider how this changes your competitive landscape too. Larger competitors with deeper pockets will increasingly have access to AI capabilities that smaller businesses can't afford. The democratisation of AI that everyone promised? It's not happening the way we expected.
What To Do About It
1. Audit your AI dependencies now. List every AI tool your business relies on and identify which ones are mission-critical versus nice-to-have. Build manual or alternative workflows for your most important processes.
2. Diversify your AI toolkit. Don't build your entire operation around one provider. Test alternatives like local models or different cloud services so you're not stranded when capacity issues hit.
3. Budget for AI cost increases. The current pricing for AI services likely represents the bottom of the market, not the long-term reality. Plan for 2-3x price increases over the next 18 months.
4. Consider hybrid approaches. Explore running smaller AI models locally for routine tasks, reserving cloud-based AI for complex work that truly needs the most powerful models.
5. Lock in long-term contracts where possible. If you've found AI tools that work well for your business, consider annual subscriptions or enterprise agreements before capacity constraints drive prices up.
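If you have a developer on hand, the diversification and hybrid points above boil down to one pattern: a failover chain that tries your preferred provider first and falls through to alternatives (including a local model) when capacity errors hit. The sketch below is illustrative only; the provider names and call functions are hypothetical stand-ins, not real SDK calls.

```python
"""Minimal sketch of a provider failover chain.

All provider names and call functions here are hypothetical stand-ins
for whatever SDKs or local runtimes your business actually uses.
"""
from dataclasses import dataclass
from typing import Callable


@dataclass
class Provider:
    name: str
    call: Callable[[str], str]  # prompt -> completion


class CapacityError(Exception):
    """Raised when a provider is throttled or at capacity."""


def complete_with_fallback(prompt: str, providers: list[Provider]) -> tuple[str, str]:
    """Try each provider in priority order; fall through on capacity errors."""
    failures = []
    for p in providers:
        try:
            return p.name, p.call(prompt)
        except CapacityError as exc:
            failures.append((p.name, str(exc)))
    raise RuntimeError(f"all providers at capacity: {failures}")


# Hypothetical stand-ins: a throttled cloud model and a local fallback.
def cloud_model(prompt: str) -> str:
    raise CapacityError("429 Too Many Requests")


def local_model(prompt: str) -> str:
    return f"[local draft] {prompt[:40]}"


chain = [Provider("cloud", cloud_model), Provider("local", local_model)]
used, answer = complete_with_fallback("Summarise this supplier invoice", chain)
print(used)  # the local model handles the request when the cloud tier is throttled
```

The point isn't the code itself but the shape: routine work should degrade gracefully to a cheaper or local option rather than stopping dead when your primary provider caps usage.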
https://www.anthropic.com/news/google-broadcom-partnership-compute
Published: 2026-04-07
https://github.com/anthropics/claude-code/issues/44257
Published: 2026-04-07