Alphabet’s planned investment is enormous even by 2026 AI standards: $10 billion committed now, with another $30 billion tied to performance targets. Reuters says the deal comes as Anthropic’s run-rate revenue tops $30 billion and the company races to lock in more computing capacity after parallel deals with Amazon, Broadcom, and CoreWeave.
#compute
This is not just another AI funding round. TechCrunch reports Google will put in $10 billion now at a $350 billion valuation, with as much as $30 billion more tied to Anthropic performance targets and 5 gigawatts of fresh compute over five years.
Why it matters: AI infrastructure is moving from single-accelerator rentals to managed clusters that resemble supercomputers. Google Cloud said A4X Max bare-metal instances support up to 50,000 GPUs and twice the network bandwidth of earlier generations.
HN treated rising GPU costs as more than infrastructure trivia. If frontier access tightens and inference gets pricier, startups may have to compete on procurement, routing, caching, evaluation, and smaller-model strategy rather than assuming abundant calls to the strongest model.
Anthropic said on April 6, 2026 that it secured multiple gigawatts of next-generation TPU capacity from Google and Broadcom starting in 2027. The deal pairs infrastructure scale with surging demand, as run-rate revenue has passed $30 billion and million-dollar customers have doubled since February.
OpenAI said on March 31, 2026 that it closed a $122 billion funding round at an $852 billion post-money valuation, announcing the news on X before publishing a company post with details. The company tied the raise to faster compute expansion, deeper enterprise and developer adoption, and a unified AI superapp strategy spanning ChatGPT, Codex, and broader agent workflows. The announcement reinforces that compute access is becoming as strategic as model quality in the frontier AI race.
A Hacker News thread with about 240 points focused attention on Anthropic’s April 6 announcement of the Google and Broadcom TPU deal, highlighting the company’s claims of more than $30 billion in run-rate revenue and over 1,000 seven-figure business customers.
Alongside the funding announcement, OpenAI shared fresh scale claims: 900 million weekly active users, $2 billion in monthly revenue, and API throughput above 15 billion tokens per minute.
On March 6, 2026, OpenAI reposted a message from Sachin Katti saying construction is underway in Port Washington, Wisconsin. The post turns OpenAI’s previously announced Stargate and partner-led compute strategy into a visible on-the-ground build milestone.
NVIDIA AI says it is partnering with Thinking Machines to deploy at least 1 gigawatt of NVIDIA Vera Rubin systems for frontier AI training. Thinking Machines frames the partnership as infrastructure both for frontier model training and for platforms for customizable AI.