Nvidia (NVDA) has struck an expanded multi-year data center deal with Meta (META) that will see the chip maker supply the social media giant with millions of Blackwell and Rubin GPUs.
And while that was certainly the most exciting part of Tuesday’s news, the companies also said the deal will see Meta roll out processor-only Nvidia Grace servers in its data centers, the first large-scale deployment of the chips on their own.
Grace is the Arm-based processor that Nvidia pairs with two Blackwell or two Blackwell Ultra GPUs to form its GB200 and GB300 AI superchips, respectively.
The Grace-only servers come as Nvidia looks to capitalize on growing demand for traditional processors, with hyperscalers increasingly turning to CPUs to help power AI inference and agentic AI applications.
That spells trouble for Intel (INTC), which has long dominated the data center CPU space, and Advanced Micro Devices (AMD), which is working to take market share from Intel.
“Nvidia has been on a path to deliver more content in the data center for a while,” Gil Luria, managing director and head of technology research at D.A. Davidson, told Yahoo Finance.
“The addition of Mellanox [a networking company Nvidia acquired in 2020] put them in the networking category as well,” he said. “So when they’re selling into the data center, they’re actually selling almost a large majority of the value. But it makes sense for them to further increase that value by adding processor capacity.”
Nvidia’s move couldn’t come at a worse time for Intel, which is facing capacity constraints that prevent it from producing enough processors to meet demand from data center builders.
It’s not just data centers, though. Nvidia is also reportedly moving into Intel and AMD’s consumer business with its own laptop chip, creating a whole new headache for the PC chipmakers.
Nvidia’s move to sell processors doesn’t mean it’s giving up its massive GPU market lead. Nor is it a sign that the AI GPU market is on its last legs. Rather, it’s about capitalizing on a growing trend in the AI industry to use processors to power smaller AI models.
Giant AI models like the latest and greatest frontier models from OpenAI (OPAI.PVT), Google (GOOG, GOOGL), and Anthropic (ANTH.PVT) will still need the kind of horsepower that only a GPU can provide. But when it comes to those smaller models, processors can steal back a bit of the spotlight from GPUs.
The processors are also a bottleneck for the AI supply chain, one of many choke points in the ongoing development of AI that could hurt Nvidia’s sales over time. By bringing its own processors to the table, Luria said, Nvidia is doing everything it can to maintain sales.