Google turns AI spend into margin: Cloud grows 63%, response costs fall 30%
Original: Q1 2026 earnings call: Remarks from our CEO
Google just gave one of the clearest big-tech readings yet on whether AI spending is turning into operating leverage. In Alphabet’s Q1 2026 earnings remarks, Sundar Pichai said Cloud revenue grew 63% and crossed $20 billion for the first time, while backlog nearly doubled quarter over quarter to more than $460 billion. That is a much harder signal than generic talk about demand. It suggests enterprise AI infrastructure is already converting into large committed spending, not just pilot budgets and experimentation.
The supporting numbers were equally notable. Google said Gemini Enterprise paid monthly active users grew 40% quarter over quarter. First-party models now process more than 16 billion tokens per minute through direct customer API use, up from 10 billion last quarter. Paid subscriptions across products reached 350 million, driven mainly by YouTube and Google One. On the enterprise side, revenue from products built on Google’s gen-AI models grew nearly 800% year over year, and the company said new customer acquisition in Cloud doubled from the same period a year ago.
The efficiency side may matter even more than the growth side. Google said that after upgrading AI Overviews and AI Mode to Gemini 3, it cut the cost of serving core AI responses by more than 30%. Search latency, meanwhile, has fallen more than 35% over the past five years. This is the economic proof the market has been waiting for: not just better models, but lower unit costs at production scale. Google also said the number of deals worth between $100 million and $1 billion doubled year over year, and that it signed multiple deals above the billion-dollar mark.
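As a back-of-the-envelope illustration of the figures quoted above (the variable names are ours, and the per-response cost baseline is only relative, since Google disclosed a percentage, not a dollar figure):

```python
# Implied quarter-over-quarter API throughput growth,
# from the 10B -> 16B tokens-per-minute figures quoted above.
tokens_per_min_prev = 10e9   # last quarter
tokens_per_min_now = 16e9    # this quarter

qoq_growth = tokens_per_min_now / tokens_per_min_prev - 1
print(f"Implied QoQ API throughput growth: {qoq_growth:.0%}")  # 60%

# A >30% cut in per-response cost means each response now costs
# at most ~70% of what it did before the Gemini 3 upgrade.
cost_reduction = 0.30
relative_cost = 1 - cost_reduction
print(f"Relative per-response cost after upgrade: {relative_cost:.0%}")  # 70%
```

The point of the arithmetic is the combination: usage up roughly 60% in a quarter while unit cost fell by nearly a third, which is what operating leverage looks like at the unit level.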
Why this matters is straightforward. The AI race is shifting from headline model launches to proof that usage, cost compression, and enterprise contracts can reinforce one another. Google is now presenting AI as a business engine spanning Search, Cloud, subscriptions, and developer APIs at the same time. If those numbers hold, the debate around AI capex changes shape: the question stops being whether these companies are overspending and becomes who is best at turning compute, models, and distribution into durable margins. This quarter pushed Google firmly into that second conversation.
Related Articles
Google is signaling that enterprise AI is moving from demos to operational scale. In its April 22 Cloud Next update, the company said customer API traffic has risen to more than 16 billion tokens per minute and that just over half of its 2026 machine-learning compute investment will go to the Cloud business.
This is not just another AI funding round. TechCrunch reports Google will put in $10 billion now at a $350 billion valuation, with as much as $30 billion more tied to Anthropic targets and 5 gigawatts of fresh compute over five years.
Europe’s next AI fight on mobile is moving below the app layer and into Android itself. EU regulators are pushing Google to let rival AI services reach the same kinds of system capabilities that currently give Gemini an edge.