LaminiAI CEO Sharon Zhou has announced that AMD has begun shipping its Instinct MI300X GPUs, accelerators designed for artificial intelligence (AI) and high-performance computing (HPC) workloads. LaminiAI, which plans to use the accelerators to run large language models (LLMs), is among the first customers to receive multiple machines equipped with the GPUs. The Instinct MI300X, a sibling of AMD's Instinct MI300A, boasts impressive specifications, including eight CDNA 3 compute chiplets and 192 GB of HBM3 memory. Notably, AMD's performance figures put the Instinct MI300X ahead of Nvidia's H100 80GB, making it a promising competitor in the market. The shipment marks an important milestone for AMD as it works to make a significant impact in the AI and HPC sectors.