AMD releases new chips to power faster AI training

Key Points:

  • AMD has introduced new accelerators and processors tailored for large language models, offering improved performance and energy efficiency and underscoring its commitment to the AI chip market.
  • The company has established partnerships with Microsoft and Meta, and major manufacturers plan to integrate its Ryzen 8040 chips into upcoming products, indicating a broad impact across the industry.
  • AMD’s strategic focus on AI chips reflects the competitive landscape and the growing demand for advanced AI processing capabilities, positioning the company as a key player in the AI chip arms race.

Summary:

AMD has announced new accelerators and processors, the Instinct MI300X accelerator and the Instinct MI300A accelerated processing unit (APU), aimed at running large language models (LLMs). These products offer greater memory capacity and improved energy efficiency, positioning AMD as a significant contender in the AI chip market. AMD has partnered with Microsoft to offer the MI300X in Azure virtual machines and announced deployments with Meta. The company also unveiled the Ryzen 8040 series, which integrates neural processing units (NPUs) for enhanced AI performance in mobile devices, and expects manufacturers to release products with these chips in 2024.
