
AMD’s CEO sees AI chip demand soaring past $500 billion soon


Lisa Su said the chip demand would explode soon

AMD CEO Lisa Su predicts the demand for AI accelerator chips could top $500 billion in the next few years. This explosive growth is fueled by rising infrastructure needs from leading AI developers like OpenAI and xAI.

Su’s forecast marks a significant expectation shift, as previous market estimates were much lower. The projection also highlights AMD’s growing confidence in its Instinct series chips as genuine contenders in the AI infrastructure race.


AI inferencing is driving the growth surge

Su emphasized that inferencing, not just training, is fueling AI chip demand. As more businesses deploy models in real-time services, inference needs far outpace initial training runs.

She believes this will create long-term hardware demand across cloud, edge, and enterprise environments. AMD is positioning its MI350 series to serve this rapidly expanding use case while continuing to develop higher-performance AI accelerators.


AMD introduces powerful MI350 GPU line

At its “Advancing AI” conference, AMD launched the MI350 series, showcasing major gains in compute efficiency. AMD says the chips deliver up to 40% more tokens per dollar and run inference workloads at markedly lower cost than competing parts.

AMD says the MI355X variant, also introduced at the event, delivers up to 35 times the inference performance of its predecessor. With those gains, AMD is aggressively targeting Nvidia’s stronghold in the AI GPU market.


Seven major clients are already using Instinct

AMD announced that seven of the top 10 AI companies are already deploying Instinct accelerators. Clients include OpenAI, Meta, Microsoft, Oracle, and India’s Reliance Jio.

This early adoption signals growing trust in AMD’s performance and value, despite Nvidia’s continued dominance. Su believes broader ecosystem traction will come from open standards and developer-first infrastructure.


The US needs to ramp up chip manufacturing

Su warned that the U.S. must dramatically scale manufacturing to meet the coming wave of AI chip demand. She cited the importance of public-private partnerships and praised the U.S. AI Action Plan as a strong blueprint.

With reshoring efforts underway, she urged faster permitting, subsidies, and policy support to secure supply chains and prevent foreign dependency.


Su backs AI diversity and chip variety

Su said the market is moving toward a Cambrian explosion of chip types. She expects purpose-built accelerators beyond just GPUs to serve new workloads ranging from scientific discovery to mobile apps.

AMD’s roadmap includes modular architectures and chiplets, aiming to meet the specific needs of different AI models, rather than a one-size-fits-all approach.


Trump AI policy supports U.S. growth

Su pointed to Trump’s AI Action Plan as a welcome policy framework. The plan includes measures to ban ideologically biased AI in federal systems and promote AI exports.

It also simplifies approvals for new data centers, which is crucial for AI model deployment. For AMD, this represents a favorable climate for expanding U.S. chip operations.


Reshoring raises costs by up to 20%

Su acknowledged that making chips at TSMC’s Arizona plant will cost 5% to 20% more than in Taiwan. However, she noted the tradeoff is acceptable for greater resilience and control over domestic supply chains.

The first batch of Arizona-made chips is expected by the end of 2025, aligning with growing U.S. reshoring efforts.


New rack-scale Helios platform unveiled

AMD introduced Helios, a full rack-scale AI platform aimed at hyperscalers. Launching in 2026, Helios integrates AMD’s next-generation MI400-series accelerators with networking, software, and management layers.

It’s designed to provide plug-and-play performance for cloud and enterprise customers. AMD says Helios will lower the total cost of ownership compared to existing AI infrastructure.


Open developer access is a key strategy

AMD believes that democratizing access to its AI tools is the way forward. Through its developer cloud access program, engineers, startups, and researchers can test Instinct accelerators firsthand.

This move empowers innovation by reducing entry barriers for smaller players. AMD also emphasized its commitment to open-source platforms and tools like ROCm.

By embracing openness and transparency, AMD hopes to cultivate a thriving community that continuously improves and adapts its technology for broader AI applications.

Su says $500B is just the starting point

While $500 billion might sound staggering, Lisa Su believes it could be just the beginning. Demand for specialized hardware will keep climbing as AI becomes embedded in more industries, from healthcare, finance, and logistics to the creative sector.

Su noted that even consumer-grade AI will require massive back-end inferencing power. With AI tools transitioning from novelty to necessity, the addressable market could eventually exceed previous forecasts.

AMD aims to capture this momentum with continuous hardware and software innovation.


AMD aims to challenge Nvidia’s leadership

Although Nvidia remains the leader in AI accelerators, AMD has made strides to level the playing field. AMD claims the MI355X chips not only outperform rivals on multiple fronts but also offer better energy efficiency and cost-effectiveness.

Su clarified that AMD isn’t just playing catch-up; it’s actively innovating to disrupt. With major hyperscalers now testing AMD hardware and with TCO advantages on its side, AMD is positioning itself as the cost-performance champion in AI compute.


Inferencing is seen as a secular trend

AI workloads are shifting from training massive models to running them in production, often millions of times a day. This is where inferencing becomes essential, and AMD is targeting that exact need with chips optimized for inference efficiency.

Lisa Su noted that the company’s roadmap is focused on adapting to this trend. As generative AI, chatbots, and assistants become part of everyday workflows, AMD believes inferencing accelerators will power the next growth phase in computing.


Crusoe orders 13,000 MI355X GPUs

Crusoe Energy, a fast-growing cloud infrastructure startup, made headlines with a $400 million purchase of 13,000 MI355X GPUs.

This is one of AMD’s most significant AI chip orders to date, lending weight to the performance promises made at its Advancing AI event. Crusoe plans to deploy the chips across data centers in North America for inference-intensive workloads.

The deal underscores growing customer trust in AMD’s roadmap and signals that even non-hyperscalers see AMD as a formidable Nvidia alternative.


Analysts raise AMD price targets

Following AMD’s announcements, several analysts raised their price targets on AMD stock. Evercore lifted its target to $144 and Roth Capital to $150, citing strong MI350 series performance and a rapidly expanding AI customer base.

Analysts noted AMD’s improved ROCm software stack and growing influence in inferencing. With hyperscalers scaling up deployments and early benchmarks looking promising, Wall Street is becoming more optimistic about AMD’s potential to narrow the AI gap with Nvidia.



AMD sees this as a decade-long boom

Lisa Su believes that AI isn’t just another tech wave; it’s a generational opportunity that will shape the next decade of computing. Every layer of the stack, from data centers to edge devices, will need smarter, more efficient chips.

AMD’s strategy is to win across layers: by offering scalable performance, open software, and modular design. As global demand surges and new applications emerge, AMD sees itself playing a central role in the ongoing AI revolution.


What do you think about the surging demand for AI chips in the years ahead? Share your thoughts in the comments.


This slideshow was made with AI assistance and human editing.

