
For years, Nvidia’s graphics chips have been the default engine of the AI boom, but Google’s custom tensor processing units are suddenly stealing the spotlight.
After reports that Meta might spend billions on these chips, Nvidia’s stock slid while Alphabet’s jumped. Investors are waking up to a new reality where the hottest AI hardware no longer comes from just one supplier.

According to reporting by The Information and Reuters, Meta has been in talks to deploy Google’s TPUs in its data centers starting in 2027 and may rent chips through Google Cloud earlier.
Meta is one of Nvidia’s biggest customers, so even the possibility of switching some workloads sent a clear signal about shifting power in AI hardware.

Google announced the first Tensor Processing Unit in 2016 and said earlier prototypes had already been used inside its data centers for more than a year before the announcement. It needed something tightly tuned for deep learning inside products like Search, Maps, Photos, and Translate.
That in-house effort quietly evolved into multiple TPU generations, and now those same chips are being pitched as a serious alternative to Nvidia for outside customers.

Nvidia’s GPUs are flexible workhorses that can accelerate many kinds of computing, from gaming to AI research. TPUs are more like specialist sprinters, built primarily to process matrix math for deep learning.
That specialization can mean better performance per watt and potentially lower costs for training and running large models, especially at hyperscale. In simple terms, TPUs trade broad versatility for focused speed and efficiency.
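The workload TPUs are built around can be seen in a few lines. Dense matrix multiplication is the core operation of neural network layers, and a TPU's matrix unit performs huge numbers of these multiply-accumulates in hardware each cycle. This pure-Python sketch just shows the math itself, not anything TPU-specific:

```python
# A minimal sketch of the operation TPUs specialize in: dense matrix
# multiplication, the heart of neural network layers. A TPU's matrix
# unit streams these multiply-accumulates through fixed hardware lanes;
# this plain-Python version only illustrates the arithmetic.

def matmul(a, b):
    """Multiply an m*k matrix by a k*n matrix (lists of lists)."""
    m, k, n = len(a), len(b), len(b[0])
    out = [[0.0] * n for _ in range(m)]
    for i in range(m):
        for j in range(n):
            # One multiply-accumulate chain per output element.
            out[i][j] = sum(a[i][p] * b[p][j] for p in range(k))
    return out

# A dense layer is essentially: activations @ weights
activations = [[1.0, 2.0]]            # batch of 1, 2 input features
weights = [[0.5, -1.0], [0.25, 2.0]]  # 2-in, 2-out layer
print(matmul(activations, weights))   # [[1.0, 3.0]]
```

A GPU accelerates this too, but alongside many other kinds of computation; a TPU dedicates most of its silicon to exactly this pattern.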

Some reporting estimates that Google could capture about 10% of Nvidia’s annual data center revenue if TPU sales meaningfully expand to third-party buyers. That is not a total takeover, but for a company of Nvidia’s size, it is a meaningful slice.
If Google wins big customers like Meta, Anthropic, and heavily regulated financial firms, TPU sales could turn into a serious profit center rather than just an internal science project.

AI data centers are starting to look like power plants, and electricity is becoming a significant line item. TPUs are designed to squeeze more useful work out of each watt, especially for the dense matrix operations that dominate deep learning workloads.
For hyperscalers that are spending tens of billions on AI infrastructure, even modest improvements in energy efficiency and chip cost can add up to very large savings across a data center fleet.
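A back-of-the-envelope calculation shows why modest efficiency gains matter at fleet scale. All numbers here are illustrative assumptions, not figures from any company's disclosures:

```python
# Illustrative sketch: how a modest perf-per-watt improvement compounds
# across a large accelerator fleet. Fleet size, chip power, electricity
# price, and the 15% efficiency gain are all assumed for the example.

def annual_power_cost(chips, watts_per_chip, dollars_per_kwh):
    hours_per_year = 24 * 365
    kwh = chips * watts_per_chip * hours_per_year / 1000
    return kwh * dollars_per_kwh

baseline = annual_power_cost(chips=100_000, watts_per_chip=700,
                             dollars_per_kwh=0.08)
# Same work done at 15% better performance per watt:
improved = annual_power_cost(chips=100_000, watts_per_chip=700 * 0.85,
                             dollars_per_kwh=0.08)

print(f"baseline: ${baseline:,.0f}/yr")          # baseline: $49,056,000/yr
print(f"savings:  ${baseline - improved:,.0f}/yr")  # savings:  $7,358,400/yr
```

On these assumed inputs, a 15% efficiency gain is worth roughly $7 million a year in electricity alone for a 100,000-chip fleet, before counting cooling or hardware cost differences.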

Even with the recent stock pullback, Nvidia remains the clear leader in AI accelerators. Its GPUs power the majority of large model training today, as well as much of the inference traffic.
Hyperscalers might experiment with TPUs and custom silicon, but ripping out Nvidia overnight is not a realistic option.
For now, demand is strong enough that multiple suppliers can thrive, even as investors brace for a more competitive future.

The hardware story misses a crucial detail. Nvidia has spent years building CUDA and a massive software ecosystem that developers already know and trust.
Most AI researchers can spin up code on Nvidia hardware quickly, while Google’s TPU tooling still feels more specialized.
Software familiarity is a significant reason Nvidia can continue charging premium prices, at least until rival platforms become equally mature and user-friendly.
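The lock-in is visible even in how code chooses its hardware. The sketch below shows the fallback chain most researcher code implicitly assumes, with the CUDA path checked first; the library names are real, but the selection logic is illustrative, not a recommendation:

```python
# A sketch of the ecosystem gravity described above: most AI code
# reaches for the CUDA path first and treats everything else as a
# fallback. torch and jax are real libraries; this chain is only an
# illustration and runs fine even if neither is installed.

def pick_backend():
    try:
        import torch  # the stack most researchers know best
        if torch.cuda.is_available():
            return "cuda"
    except ImportError:
        pass
    try:
        import jax  # XLA-based stack that also targets TPUs
        if any(d.platform == "tpu" for d in jax.devices()):
            return "tpu"
    except Exception:
        pass
    return "cpu"  # portable but slow fallback

print(pick_backend())
```

The ordering itself tells the story: TPU support exists and is mature inside Google's own stack, but for most outside developers it is still the branch they reach second.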

Google is not starting from zero with its chip platform. Apple said it used chips designed by Google rather than Nvidia GPUs to train two components of its Apple Intelligence models, and Anthropic has announced plans to expand its use of Google Cloud TPUs to train and serve Claude models.
You also see names like Salesforce and image-generation startups leaning in. Every high-profile deployment makes it easier for the next big customer to take TPUs seriously.

This is not just an Nvidia story. As Google pushes TPUs and Meta explores its own custom silicon, the window for other general-purpose players narrows. AMD had been seen as the primary challenger to Nvidia, yet its shares also slid on the TPU headlines.
Intel and various specialty chip firms now have to compete for attention in a market where the largest cloud platforms are increasingly designing or purchasing bespoke hardware.

One reason Nvidia’s share price is sensitive to any whiff of competition is its astonishing profitability. Nvidia reported gross margins around 73% in fiscal 2025, which are unusually high for the hardware industry.
If Google, Meta, and others prove they can do similar AI work with cheaper or more efficient TPUs and custom chips, customers gain leverage. The first real sign of disruption will be if Nvidia has to cut prices to defend market share.

All this is happening while high-profile investors are loudly warning about an AI bubble. When someone compares Nvidia to the dot-com era’s high flyers, traders are primed to overreact to any threat, including Google’s renewed chip ambitions.
In that environment, a single report about Meta’s plans can erase hundreds of billions in market value, even if Nvidia’s underlying business remains extremely strong for now.

For me, the real test will not be one bad trading day for Nvidia. I am watching to see whether Meta actually deploys TPUs in production at scale, how many enterprise customers sign up for on-premises TPU deployments, and whether Nvidia’s margins begin to trend lower.
If Google keeps stacking big design wins while Nvidia’s pricing power softens, then this will look less like a scare and more like a genuine power shift.
What do you think about Google building in-house AI chips and mounting serious competition to Nvidia? Share your thoughts in the comments.