Technology
Thursday, December 7th, 2023 2:56 pm EST
Key Points
- Shift to AMD’s Instinct MI300X: Meta, OpenAI, and Microsoft announced their adoption of AMD’s newest AI chip, the Instinct MI300X, at an AMD investor event. This signals a significant move away from Nvidia, reflecting a trend among technology companies searching for alternatives to Nvidia’s expensive graphics processors, which have been crucial for AI program development, including OpenAI’s ChatGPT.
- Key Features and Performance: The MI300X is built on a new architecture, potentially leading to significant performance gains. A notable feature is its 192GB of HBM3 memory, a high-performance type that enables faster data transfer and accommodates larger AI models. Lisa Su, AMD’s CEO, directly compared the MI300X to Nvidia’s main AI GPU, the H100, emphasizing improved user experience and faster model responses.
- Adoption Challenges and Market Outlook: AMD faces the challenge of convincing companies heavily invested in Nvidia to add another GPU supplier, and Su acknowledged that adopting AMD requires time and effort. To address this, AMD highlighted improvements to its ROCm software suite, which competes with Nvidia’s industry-standard CUDA software. Pricing will also be crucial, with AMD emphasizing a more cost-effective alternative to Nvidia’s chips, which can cost around $40,000 each. Major companies, including Meta and Microsoft, have already committed to using the MI300X, indicating confidence in AMD’s offering. Despite modest near-term sales projections, AMD believes the total market for AI GPUs could reach $400 billion by 2027, and Su emphasized that while Nvidia currently dominates the AI chip market, AMD aims to secure a substantial share of that projected market.
In a significant move away from Nvidia, Meta, OpenAI, and Microsoft announced at an AMD investor event their intent to utilize AMD’s latest AI chip, the Instinct MI300X. This marks a notable shift as technology companies explore alternatives to Nvidia’s expensive graphics processors, which have been pivotal for AI program development, including OpenAI’s ChatGPT. The adoption of AMD’s high-end chip by major players could potentially reduce costs associated with developing AI models and create competitive pressure on Nvidia’s robust AI chip sales growth.
AMD’s MI300X, scheduled to ship early next year, is built on a new architecture and boasts a distinctive feature—192GB of cutting-edge HBM3 memory, facilitating faster data transfer and accommodating larger AI models. Lisa Su, AMD’s CEO, directly compared the MI300X to Nvidia’s main AI GPU, the H100, emphasizing the performance gains and improved user experience. The key question for AMD is whether companies heavily invested in Nvidia will commit the resources to integrate another GPU supplier. Su acknowledged the effort required to adopt AMD but highlighted enhancements to AMD’s ROCm software suite, addressing a critical shortcoming compared to Nvidia’s CUDA software, which is an industry standard in AI development.
Price considerations will play a crucial role, with AMD aiming to offer a cost-effective alternative. While the pricing for the MI300X wasn’t disclosed, Nvidia’s chips can cost around $40,000 each. Su stressed that AMD’s chip must be more cost-effective to purchase and operate than Nvidia’s to attract customers.
Several major companies have already committed to using the MI300X, signaling confidence in AMD’s offering. Meta and Microsoft, the top purchasers of Nvidia H100 GPUs in 2023, according to a recent report, have embraced the MI300X for AI inference workloads. Meta specified applications such as processing AI stickers, image editing, and supporting its assistant. Microsoft’s CTO, Kevin Scott, announced plans to provide access to MI300X chips through its Azure web service, while Oracle’s cloud will also integrate these chips. OpenAI expressed support for AMD GPUs in its software product Triton, utilized in AI research to access chip features.
Despite a cautious sales projection for the MI300X of around $2 billion in total data center GPU revenue in 2024, AMD recognizes the vast potential in the AI GPU market, projecting it to reach $400 billion over the next four years. This ambitious outlook underscores the growing significance of high-end AI chips and AMD’s strategic focus on capturing a significant share of this lucrative market. Su emphasized that while Nvidia currently dominates the AI chip market, there is room for multiple players, and AMD aims to secure a substantial portion of the projected $400 billion market by 2027.
For the full original article on CNBC, please click here: https://www.cnbc.com/2023/12/06/meta-and-microsoft-to-buy-amds-new-ai-chip-as-alternative-to-nvidia.html