Nvidia’s latest AI chip will cost more than $30,000, CEO says

Technology
Tuesday, March 19th, 2024 8:31 pm EDT

Key Points

  • Nvidia’s next-generation AI graphics processor, Blackwell, will be priced between $30,000 and $40,000 per unit, with CEO Jensen Huang highlighting the innovation required and an estimated $10 billion in R&D costs.
  • The price puts Blackwell in the same range as its predecessor, the Hopper-based H100, reflecting strong demand for chips that train and deploy AI software such as ChatGPT; the Hopper generation itself marked a significant price increase over the generation before it.
  • Nvidia’s AI chips have roughly tripled the company’s quarterly sales since the AI boom began in late 2022, with major AI companies such as Meta using them to train models. The quoted cost covers not only the chip itself but also data center design and integration, and actual pricing varies with volume and purchase configuration. Nvidia also announced three versions of the Blackwell AI accelerator, each with different memory configurations, expected to ship later this year.

Nvidia CEO Jensen Huang revealed on CNBC’s “Squawk on the Street” that the company’s next-generation AI graphics processor, named Blackwell, will be priced between $30,000 and $40,000 per unit. Huang said Nvidia had to invent new technology to make Blackwell possible, estimating the company’s research and development spending at around $10 billion. That pricing puts Blackwell in line with its predecessor, the H100 (Hopper), which sold for an estimated $25,000 to $40,000 per chip. Huang later explained to CNBC’s Kristina Partsinevelos that the cost encompasses not just the chip itself but also factors such as data center design and integration.

Nvidia typically releases a new generation of AI chips every two years, with each iteration offering improved performance and energy efficiency. Blackwell, which combines two chips and is physically larger than its predecessors, continues that trend.

Nvidia’s AI chips have driven much of the company’s sales growth since the AI boom of late 2022, when OpenAI’s ChatGPT was introduced. Major AI companies such as Meta have relied heavily on Nvidia’s H100 GPUs for training AI models. Nvidia’s chip pricing varies depending on factors such as volume, purchase configuration, and whether the chips are bought directly or through vendors like Dell, HP, or Supermicro. Nvidia recently announced three versions of the Blackwell AI accelerator—B100, B200, and GB200—each with slightly different memory configurations, expected to ship later this year.

Read the full original article on CNBC: https://www.cnbc.com/2024/03/19/nvidias-blackwell-ai-chip-will-cost-more-than-30000-ceo-says.html