Jensen Huang: New AI Chips Likely to Cost $30K-$40K, Nvidia Invested $10B in Development

Fri, Mar 22 2024 06:33 AM EST

Nvidia CEO Jensen Huang told The Verge in an interview Tuesday that the company's next-generation Blackwell architecture AI chips will likely be priced between $30,000 and $40,000.

"We had to invent a whole new kind of technology to build something this advanced," Huang said, holding a Blackwell architecture chip in his hand. He added that Nvidia has invested nearly $10 billion in R&D for this series of chips.

Analysts say the price tag reflects the huge demand the new chips are expected to see for training and deploying AI software such as ChatGPT. It also puts them in the same price range as the previous-generation H100 chips, based on the Hopper architecture, which sold for $25,000 to $40,000. The Hopper architecture likewise carried a price increase over its predecessor when it launched in 2022.

Huang added that the price tag covers not just the chip itself, but also the cost of designing data centers around it and integrating it into other companies' data centers.

Nvidia releases a new AI chip roughly every two years, each time promising faster processing and better power efficiency. The latest generation, Blackwell, combines two separate chips into one package, resulting in a larger, more powerful product.

Since OpenAI debuted its ChatGPT chatbot in late 2022, the AI frenzy has fueled demand for Nvidia's AI chips, helping the company triple its quarterly revenue. Over the past year, most top AI companies and researchers have used Nvidia's H100 chips to train their AI models. Meta, for instance, announced earlier this year that it would buy hundreds of thousands of Nvidia H100 GPUs.

Nvidia isn't disclosing exact pricing for the chips. Different configurations will be available, and the final price for large end users like Meta or Microsoft will depend on factors such as purchase volume and whether they buy directly from Nvidia or through server vendors like Dell, Hewlett Packard Enterprise, or Supermicro. Some servers are reported to hold as many as eight AI GPUs.

On Monday, Nvidia unveiled at least three configurations of Blackwell architecture chips: the B100, B200, and GB200, the last of which combines two Blackwell architecture GPUs with an Arm architecture CPU. The three chips have slightly different memory configurations, and all are expected to ship later this year.