AMD and Intel take on Nvidia with new AI chips and pricing strategies

news
Jun 04, 2024
4 mins
CPUs and Processors

Intel's Lunar Lake AI chip and AMD's Instinct MI325X accelerator are set to offer stiff competition to Nvidia's dominance in the AI chip market.

The Computex trade show in Taipei has become the battleground for AI hardware giants as AMD and Intel unveil new products and strategies, challenging Nvidia’s market dominance.

AMD has announced the launch of its Instinct MI325X accelerator, set for release in the fourth quarter of 2024. This will be followed by the MI350 series, based on the new CDNA 4 architecture, expected in 2025.

The company claimed the MI350 will offer a 35-fold increase in AI inference performance over its predecessor, the MI300 series. AMD also plans to introduce the Instinct MI400 series, leveraging its future CDNA “Next” architecture.

Meanwhile, Intel revealed its Lunar Lake client processor architecture, aiming to expand its footprint in the AI PC category.

Intel also rolled out its Xeon 6 processors, designed to improve performance and power efficiency in data center operations, and announced significantly lower pricing for its Gaudi 2 and Gaudi 3 AI accelerator kits compared with similar products from Nvidia.

These announcements closely follow Nvidia CEO Jensen Huang’s disclosure that Nvidia’s next-generation AI chip platform, Rubin, slated for a 2026 release, will include new GPU, CPU, and networking chips.

Competing with Nvidia

Analysts say that Nvidia’s Rubin platform can outperform chips from rivals AMD and Intel in terms of raw power. However, the market positioning and unique selling points of AMD and Intel’s offerings will be key factors in determining their competitiveness against Nvidia’s technology.

“Intel, for instance, has been aggressive with its pricing, significantly undercutting Nvidia,” said Akshara Bassi, senior research analyst at Counterpoint Research. “The Gaudi 3 is priced at about $125,000 and Gaudi 2 at $65,000, compared to Nvidia’s solutions that can cost over $300,000. This pricing makes Intel an attractive option, especially considering the total cost of ownership and performance per watt, which are critical for many businesses.”

In practice, this means Nvidia would remain the top choice for foundational AI workloads involving trillions of parameters, thanks to its superior computing power, while AMD and Intel may provide cost-effective alternatives for more specialized tasks.
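Bassi's point about total cost of ownership and performance per watt can be illustrated with a rough back-of-the-envelope comparison. The Python sketch below uses only the accelerator kit prices quoted above; the power draw, electricity rate, system lifetime, utilization, and relative throughput figures are hypothetical placeholders, not vendor numbers, and the calculation is a simplification of how buyers might weigh list price against running costs.

```python
# A minimal TCO-per-throughput sketch. Only the list prices quoted in the
# article are real; every other figure below is a hypothetical assumption.

def tco_per_unit_throughput(list_price_usd, power_kw, relative_throughput,
                            years=3, usd_per_kwh=0.10, utilization=0.8):
    """Rough lifetime cost (purchase + energy) per unit of normalized throughput."""
    hours = years * 365 * 24 * utilization
    energy_cost = power_kw * hours * usd_per_kwh
    return (list_price_usd + energy_cost) / relative_throughput

# Throughput is normalized so the Nvidia-class system = 1.0 (assumption).
systems = {
    "Intel Gaudi 3 kit":   tco_per_unit_throughput(125_000, power_kw=8.0,  relative_throughput=0.7),
    "Intel Gaudi 2 kit":   tco_per_unit_throughput(65_000,  power_kw=6.0,  relative_throughput=0.4),
    "Nvidia-class system": tco_per_unit_throughput(300_000, power_kw=10.0, relative_throughput=1.0),
}

for name, cost in systems.items():
    print(f"{name}: ~${cost:,.0f} per unit of normalized throughput")
```

Under these assumed inputs, the cheaper kits can come out ahead on cost per unit of throughput even at lower absolute performance, which is the trade-off the analysts describe; with different throughput or power assumptions the ranking can easily flip.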

This strategy allows each company to play to its unique selling points, according to Manish Rawat, a semiconductor analyst at Techinsights.

“Nvidia seeks to bolster its ecosystem and innovation leadership, while AMD aims to capitalize on its roadmap and partnerships,” Rawat said. “Intel focuses on cost efficiency, reliability, and strategic alliances to regain market share in this competitive landscape. Each company’s approach underscores their distinct strengths and challenges in shaping the future of AI and computational technologies.”

Industry trend toward inference

The increased prevalence of AI infrastructure could lead to a stronger emphasis on inferencing. In fact, Nvidia recently reported that about 40% of its AI chip revenue comes from inferencing.

“We believe this will become more skewed towards inferencing as AI integrates deeper into business systems and processes,” Bassi said. “Such integrations will require real-time results, whether inferencing at the edge or at central cloud solutions, depending on how AI workloads and use cases evolve.”

From a hardware perspective, the market may also see a rise in custom chips. Many hyperscalers have announced custom AI hardware of their own. Amazon, for instance, offers Trainium, Inferentia, and Graviton; Microsoft has Cobalt; and Meta offers MTIA.

“We anticipate more solutions in the future where cloud providers will offer services powered by their own chips,” Bassi said. “This trend of custom chip solutions by hyperscalers is expected to continue.”

Expect stiffer competition

The advancements from AMD and Intel are set to intensify competition in the AI chip market, fostering innovation and potentially accelerating technological progress and performance improvements.

“Intel’s aggressive pricing strategy may induce Nvidia and AMD to reevaluate their pricing models, potentially making AI hardware more accessible, which could benefit smaller firms and startups in the AI sector,” Rawat said. “As competition heightens, there could be increased industry consolidation and strategic partnerships among semiconductor companies to capitalize on synergies and address market gaps effectively.”

Finally, the introduction of AI capabilities into PCs, exemplified by AMD’s AI PC chips, signals a broader trend toward integrating AI functionalities into various devices.

“This expansion is likely to stimulate growth in the AI software ecosystem, as well as in complementary sectors such as cloud computing, edge computing, and IoT, broadening the applications and impact of AI technology across industries,” Rawat added.