**MI300:** Is AMD Losing Ground to Nvidia Amid Slow Backlog Growth? A look at the latest on AMD’s MI300 and how it stacks up against Nvidia’s Blackwell GPUs.

Austin, Texas – Advanced Micro Devices (AMD) recently released its Q1 2024 earnings, providing updates on its MI300 AI accelerators. The company reported an increase in its 2024 backlog from $3.5 billion to $4 billion, pointing to continued growth in data center GPUs over the coming year. However, the modest pace of that increase has raised concerns that AMD may be facing competitive pressure from Nvidia.

Nvidia’s recent announcement of its next-generation Blackwell GPUs and expanded AI software offerings may have shifted the competitive landscape against AMD’s MI300 GPUs. This could weigh on AMD’s backlog growth and overall competitiveness in the market, and as a result AMD’s rating has been downgraded from Strong Buy to Hold.

The MI300 outlook for 2024 suggests that AMD’s total sales should increase quarter by quarter, with projections pointing to a run rate of about $6.4 billion in Q4. Despite that progress in 2024, concerns remain about AMD’s trajectory in 2025, particularly whether revenue can be sustained at levels that support continued expansion.

The slow backlog growth in Q1 raises questions about AMD’s ability to compete with Nvidia over the long run. The shifting competitive landscape and Nvidia’s new offerings challenge AMD’s future growth potential, and given that uncertainty, investors are advised to take a cautious approach.

While AMD remains a formidable player in the chipmaker industry, concerns about its competitiveness in data center GPUs highlight potential risks for investors. As the year progresses, monitoring AMD’s performance and market dynamics will be crucial to evaluating its long-term prospects. Amid this uncertainty, exploring alternative investment opportunities in the AI sector may offer investors a more stable approach.