
Amazon’s custom hardware is rising fast, and Nvidia may feel the impact


Amazon Expands Its Presence in Custom AI Chips

Amazon Web Services is no longer just renting out Nvidia chips; it is designing its own. At AWS re:Invent 2025, AWS unveiled Trainium 3, a custom accelerator designed for training and running large AI models in the cloud.

The message is subtle but powerful: Amazon wants more control over its AI stack and less dependence on outside silicon suppliers.

Trainium 3 reduces training costs and power consumption

AWS states that Trainium 3 can reduce the cost of training and serving AI models by up to 50 percent compared to equivalent GPU-based systems.

AWS reports that Trainium 3 delivers up to about 4.4 times the compute performance of Trainium 2 while improving energy efficiency by roughly 40 percent. For customers wrestling with eye-watering AI bills, that combination of speed and efficiency is hard to ignore.

Big cloud customers love performance at the right price

Inside AWS, the mantra is clear: you do not need to beat Nvidia on every benchmark if you can win on economics. Executives openly say they must prove that their chips deliver strong performance at the right price.

That framing suggests Trainium will be sold as the practical, cost-efficient workhorse for many workloads, rather than as a flashy spec-sheet champion.

Trainium already powers serious real-world AI models

These chips are not just lab toys. Amazon is using Trainium to train its own Nova family of AI models and to power customer services. Anthropic is scaling Project Rainier on AWS and plans to use more than one million Trainium 2 chips to build and deploy Claude in production.

When big-name AI labs bet on your silicon, it signals to the rest of the industry that your hardware is truly production-ready.

Amazon builds a full-stack AI ecosystem around chips

Custom hardware is only one layer of Amazon’s strategy. On top of the hardware sits Amazon Nova, the in-house model family, and Amazon Bedrock, which provides a single interface to call models from Anthropic, Mistral, and other third-party providers, as well as Amazon’s own models.

The long-term goal is evident to me: AWS aims to be the platform where companies can access chips, models, and tools within a single, integrated cloud environment.

Cloud customers increasingly want Nvidia alternatives

The timing could not be better for Amazon. Nvidia’s top GPUs are incredibly powerful but also expensive and often in short supply. As AI demand explodes, many companies are wary of being locked into a single supplier.

Custom chips from hyperscalers, such as AWS’s Trainium and Google’s TPUs, give them negotiating leverage and technical options, which is precisely what procurement teams crave right now.

Nvidia still dominates but feels the competitive heat

Despite all this, Nvidia is far from doomed. It still serves as the reference platform for many cutting-edge models and is expected to generate hundreds of billions of dollars in chip bookings through 2026. However, when Amazon claims it can halve AI costs with Trainium 3, traders take notice.

Nvidia shares often wobble on these announcements, reflecting genuine concern that its margins could face pressure.

Trainium 4 will link directly with Nvidia systems

The relationship is not purely adversarial. Amazon has already announced that its next chip, Trainium 4, will support Nvidia NVLink Fusion, a high-speed interconnect.

That means future AWS clusters will mix custom silicon with Nvidia hardware in the same fabric. This is a classic coopetition move; Amazon reduces its reliance, while Nvidia stays deeply embedded in the most advanced cloud architectures.

Nvidia leans on neutrality in a divided market

One quiet advantage Nvidia still holds is neutrality. Cloud providers such as Amazon, Google, and Microsoft compete fiercely with one another. Some customers will hesitate to build critical systems around a rival’s proprietary chips.

Nvidia, positioned in the middle as an independent supplier, can continue to sell to all sides. That makes its ecosystem resilient even as hyperscalers roll out more homegrown silicon.

Amazon positions itself as an AI industrial platform

By pairing Trainium with its massive cloud footprint, AWS is pitching a new industrial era for AI, not just a toolkit. Executives talk about creating the compute fabric for the AI industrial revolution.

In practice, this means tightly integrating chips, networking, storage, and models, so customers can scale quickly without having to stitch together parts from multiple vendors themselves.

Bedrock turns AWS into an AI model marketplace

Bedrock is a crucial piece in this puzzle. It enables customers to call Nova, Claude, Mistral, Google, and other models through a single AWS interface, all running on Amazon’s infrastructure, making it much easier for developers to experiment and standardize.

For Nvidia, it is a mixed picture; it still powers many of those models, but the brand relationship increasingly flows through AWS first.

Custom chips signal a shift in the power dynamics of AI

The rise of Trainium 3 and other proprietary accelerators marks a more profound shift. The old model was simple: buy GPUs from Nvidia and rent them out at a markup.

Now, the most significant cloud providers want to design, own, and optimize their own silicon. That change moves more of the value chain in-house and inevitably chips away at Nvidia’s long-term pricing power.

And if you want to see how cloud giants are already reshaping Nvidia’s global footing, take a look at the campaign Amazon and Microsoft are backing to cut Nvidia’s China exports.

Why this matters for the next phase of the AI race

In the short term, Nvidia and Amazon will continue to succeed together, as many flagship models still rely on Nvidia hardware. But over the next decade, the balance could tilt.

If AWS proves its chips are good enough for most workloads at far lower cost, more customers will quietly migrate. That is how disruption often starts, not with a crash, but with a slow, relentless shift.

And if you want to see another way Amazon is reshaping its future stack, take a look at Amazon expands robotics program, raising concerns over warehouse jobs.

What do you think about Amazon’s custom chips and the challenge they pose to Nvidia’s dominance of the AI market? Share your thoughts in a comment.

This slideshow was made with AI assistance and human editing.

This content is exclusive for our subscribers.

Get instant FREE access to ALL of our articles.

Was this helpful?
Thumbs UP Thumbs Down
Prev Next
Share this post

Lucky you! This thread is empty,
which means you've got dibs on the first comment.
Go for it!

Send feedback to ComputerUser



    We appreciate you taking the time to share your feedback about this page with us.

    Whether it's praise for something good, or ideas to improve something that isn't quite right, we're excited to hear from you.