This episode features AMD CEO Lisa Su discussing the AI revolution, focusing on the intense demand for high-end GPUs to power large AI models. She outlines AMD's strategy to compete with Nvidia through their new MI300 chip, addresses the chip supply chain challenges, and reveals how AMD is leveraging AI internally. A must-listen for anyone interested in the foundational hardware driving the AI boom.
Key takeaways
AMD's MI300 chip is designed to compete directly with Nvidia's H100 on performance for AI workloads; ecommerce operators should watch this competition, since it will shape future AI infrastructure costs and availability. If you are building AI into your commerce stack, consider which hardware it runs on.
The current chip supply chain is largely balanced, with the exception of high-end GPUs for AI; this impacts all businesses relying on AI models, potentially leading to increased costs or limited access to advanced AI capabilities.
Companies like AMD are actively increasing manufacturing capacity for AI chips; this indicates an anticipated long-term surge in AI adoption and a future easing of supply constraints, which could benefit ecommerce businesses looking to integrate more AI.
AMD is integrating AI extensively within its own operations, from chip design to supply chain management; this highlights the practical application of AI for operational efficiency and innovation that other businesses can emulate.
Government initiatives like the CHIPS and Science Act are crucial in shaping the semiconductor landscape; ecommerce companies should be aware of these broader economic and technological forces that influence the tools and infrastructure they rely on.
Themes
ai & automation, supply chain & operations, finance & fundraising, founder & leadership
Today, we’re bringing you something a little different. The Code Conference was this week, and we had a great time talking live onstage with all of our guests. We’ll be sharing a lot of these conversations here in the coming days, and the first one we’re sharing is my chat with Dr. Lisa Su, the CEO of AMD. Lisa and I spoke for half an hour, and we covered an incredible number of topics, especially about AI and the chip supply chain. The balance of supply and demand is overall in a pretty good place right now, Lisa told us, with the notable exception of the high-end GPUs powering all of the large AI models that everyone’s running. The hottest GPU in the game is Nvidia’s H100 chip. But AMD is working to compete with a new chip Lisa told us about called the MI300 that should be as fast as the H100. You’ll also hear Lisa talk about what companies are doing to increase manufacturing capacity. Finally, Lisa answered questions from the amazing Code audience and talked a lot about how much AMD is using AI inside the company right now. It’s more than you think, although Lisa did say AI is not going to be designing chips all by itself anytime soon. Okay, Dr. Lisa Su, CEO of AMD. Here we go.

Transcript: https://www.theverge.com/e/23658688

Links:
AI startup Lamini bets future on AMD's Instinct GPUs
Biden signs $280 billion CHIPS and Science Act
Pat Gelsinger came back to turn Intel around — here’s how it’s going
Huawei’s chip breakthrough poses new threat to Apple in China — and questions for Washington
AMD expands AI product lineup with GPU-only Instinct MI300X
Microsoft is reportedly helping AMD expand into AI chips
US curbs AI chip exports from Nvidia and AMD to some Middle East countries
Apple on the iPhone 15 Pro: 'It's Going to be the Best Game Console'

Credits:
Decoder is a production of The Verge, and part of the Vox Media Podcast Network.
Today’s episode was produced by Kate Cox and Nick Statt and was edited by Callie Wright.
The Decoder music is