Microsoft outbuys tech giants with massive Nvidia AI chip orders

Microsoft has secured its spot at the top of the AI hardware food chain, outspending every tech giant in the game and buying 485,000 Nvidia Hopper chips this year. This leaves competitors like Meta, Amazon, and Google gasping for air in the AI arms race.

Meta came closest but still trailed far behind, with 224,000 chips. Amazon and Google? Not even close, with orders of 196,000 and 169,000 Hopper chips respectively.

Nvidia’s GPUs have become Silicon Valley’s gold standard, driving a frenzy of data center investments to support cutting-edge AI models like ChatGPT. Microsoft, with its Azure cloud infrastructure, is going all in, betting big on AI as the future of tech.

Microsoft’s giant leap

Nvidia’s Hopper chips, hailed as the crown jewels of AI processing, have been the hottest tech commodity for two years. They’ve become critical tools for training and running next-gen AI models.

Analysts at Omdia estimate that Microsoft’s chip purchases more than tripled compared to last year, a direct response to skyrocketing demand following ChatGPT’s breakout success.

“Good data center infrastructure… takes multi-years of planning,” said Alistair Speirs, senior director of Azure Global Infrastructure at Microsoft. The company is also positioning itself to dominate the cloud market, renting out its hardware to customers through Azure.

Omdia’s analysis shows that Microsoft’s Hopper haul dwarfs the 230,000 chips bought by ByteDance and Tencent combined. However, the chips shipped to China were modified H20 models—downgraded to comply with US export restrictions.

Nvidia tweaked these chips to limit their capabilities, but even with reduced power, Chinese companies are snapping them up.

The AI spending spree

This year alone, tech companies have shelled out a jaw-dropping $229 billion on servers, with Microsoft leading the charge. The company’s $31 billion capital expenditure makes it the biggest spender in the global AI infrastructure boom, overshadowing Amazon’s $26 billion.

According to Omdia, Nvidia GPUs accounted for 43% of server spending in 2024, cementing the company’s dominance in AI hardware. Nvidia’s Hopper chips may be the star, but Microsoft isn’t putting all its eggs in one basket. It has begun rolling out its own Maia chips, installing 200,000 units this year.

Meanwhile, Amazon has been ramping up production of its Trainium and Inferentia chips, deploying 1.3 million units to fuel its AI ambitions. It’s even building a new AI cluster featuring hundreds of thousands of Trainium chips for Anthropic, an OpenAI competitor it has backed with an $8 billion investment.

Meta and Google are also doubling down on their in-house chips, each deploying around 1.5 million units of their custom designs.

Source: https://www.cryptopolitan.com/microsoft-massive-nvidia-ai-chip-orders/