
Meta Doesn't Expect NVIDIA's Latest AI Chips This Year

Fri, Mar 22 2024 07:21 AM EST

Meta, the parent company of Facebook, said it does not expect to get its hands on NVIDIA's latest flagship AI chips anytime this year.

NVIDIA, the leading designer of graphics processing chips widely used in cutting-edge AI models, unveiled its newest Blackwell architecture B200 chip at its annual developer conference on Monday.

The chipmaker claims its new B200 models can be up to 30 times faster than its older chips at certain tasks, such as getting chatbots to answer questions. It did not, however, share specifics on how well the chips perform at training chatbots on large datasets, the work that has been a key driver of NVIDIA's revenue growth.

"We expect it [B200] to start ramping in the back half of the year," NVIDIA's CFO, Colette Kress, told analysts on a conference call Tuesday, but added that shipments of the new chips wouldn't really ramp up until 2025.

Social media giant Meta, a major NVIDIA customer, has already bought hundreds of thousands of NVIDIA's H100 chips to upgrade its content recommendation systems and develop generative AI products.

Meta CEO Mark Zuckerberg said in January that the company planned to buy about 350,000 H100 chips by the end of the year. Combined with other chip models, that would give Meta the equivalent of about 600,000 H100s, he said.

In its latest statement, Meta said it will use the Blackwell architecture chips to train its Llama AI models. Meta recently announced two new GPU clusters for training Llama 3, each of which it says contains about 24,000 H100 GPUs.

A Meta spokesperson said the company will continue to refine Llama 3 on those clusters and plans to use Blackwell chips to train future generations of the model.