AMD Unveils Next-Generation AI Chips, Rivaling Nvidia’s Dominance

Advanced Micro Devices (AMD) has revealed new details about its next-generation AI chips, the Instinct MI400 series, which are set to ship next year.

Helios Server Rack

The MI400 series will be assembled into a full server rack called Helios. This setup allows thousands of chips to be connected as one "rack-scale" system. According to AMD CEO Lisa Su, "For the first time, we architected every part of the rack as a unified system."

OpenAI’s Enthusiasm

OpenAI CEO Sam Altman attended the launch event and expressed excitement about using AMD’s new chips. He remarked, "When you first started telling me about the specs, I was like, there’s no way, that just sounds totally crazy," and later added, "It’s going to be an amazing thing."

Hyperscale Clusters

The Helios setup simplifies the creation of hyperscale clusters of AI computers that can span entire data centers and utilize significant power. Su compared this setup to Nvidia’s Vera Rubin racks.

Competition with Nvidia

Nvidia is AMD’s primary competitor in the big data center GPU market for AI applications. OpenAI has been providing AMD with feedback on its MI400 roadmap, and the adoption of AMD chips by OpenAI could enhance credibility among potential customers seeking alternatives.

Nvidia has maintained market dominance since 2017, primarily due to its early release of essential software for developers.

Performance Claims

AMD claims that its latest-generation Instinct MI355X processor outperforms Nvidia’s Blackwell processors, despite Nvidia’s use of proprietary CUDA software. The company asserts that the processor offers seven times the computing power of the previous generation.

Lisa Su highlighted that while AMD has always known its hardware was strong, open-source software frameworks have made significant advancements. She believes AMD’s AI chips outperform Nvidia’s offerings on inference workloads. Inference refers to running deployed models, such as chatbots or generative AI applications, which require more processing power than traditional server applications.

Competitive Landscape

AMD’s chips are expected to compete with Nvidia’s B100 and B200 chipsets released last year. Nvidia recently announced its own B300 series, which features up to twice as many compute units per chip. However, AMD representatives believe that Helios remains faster overall because it can handle larger models with high-speed memory.

AMD also backed an open networking standard called UALink, which ties its rack systems together and allows users to mix networking hardware from various vendors, including Intel Corp.

Market Outlook

The cost of the new chips has not been disclosed, but AMD says it intends to undercut rival products on price and operating cost in the coming years. Major cloud companies are expected to invest hundreds of billions of dollars in building new GPU-based data center clusters to accelerate AI model development, including $300 billion in planned capital expenditures by large technology companies this year alone.

AMD anticipates that the total market for artificial intelligence processors will approach $500 billion by 2028, although it has not specified what share it expects to capture. The data center GPU business is currently the largest growth area for major industry players.

Andy Dieckmann, director of Data Center GPU product management, stated, "We don’t see any reason why we can’t compete against Nvidia." He added that the current total addressable market is estimated to be over $20 trillion, and that with so many potential customers seeking solutions, it will take time to reach it.

Johnny Kung, vice president of marketing and sales, noted that at least one customer has already adopted the MI355X for production, with several others planning to run their own instances starting in the third quarter.
