
Microsoft has said publicly that it wants to use more of its own chips in its data centers. The company currently runs a mix of Nvidia, AMD, and its own silicon. Its CTO, Kevin Scott, said Microsoft “absolutely” aims to rely mainly on its own chips over the longer term.
The idea is to reduce reliance on external GPU vendors. Microsoft has already built custom chips like Maia and Cobalt. This shift reflects growing demand, supply constraints, and cost pressures in AI infrastructure.

Relying on third-party GPUs brings challenges: supply shortages, high costs, and a lack of full control. Microsoft believes designing its own silicon allows optimization for its specific workloads. In-house chips can be tuned for power efficiency, thermal handling, and integration with networking.
It could also reduce vendor lock-in. Microsoft sees this as a way to scale AI compute more sustainably. The long-term vision covers the entire system design, from chip to cooling.

The Azure Maia 100 AI accelerator is one of Microsoft’s own custom GPU-style chips. It is designed to support large language models and other AI workloads. Maia is intended to help alleviate the load on external GPU suppliers.
Microsoft announced it in 2023, and it forms part of its strategy to build internal AI capacity. Though early versions are behind Nvidia’s top GPUs in raw performance, they give Microsoft flexibility. Maia is a cornerstone of Microsoft’s chip self-sufficiency plans.

Alongside Maia, Microsoft developed the Azure Cobalt 100, an ARM-based CPU for its data centers. Cobalt handles non-GPU workloads and helps balance tasks in Microsoft’s cloud infrastructure. It supports Microsoft’s goal of designing its own compute stack more broadly, not just GPUs.
Having custom CPUs lets Microsoft better coordinate performance, power, and cost. It also gives Microsoft more control over the entire system. Cobalt is part of a dual-silicon strategy: both CPU and AI-accelerator designs in-house.

Microsoft evaluates chips on “price performance”: how much compute you get for the cost. So far, Nvidia GPUs have delivered strong value for many AI workloads, and Microsoft has said it will choose whatever offers the best cost-to-performance ratio.
Its in-house Maia chip is not yet matching Nvidia’s high-end models in many benchmarks. But as designs improve, Microsoft expects its custom chips to close the gap. The aim is to compete strongly enough that external GPUs are needed less often.
Microsoft also offers AMD’s MI300X chips on Azure to give customers choices beyond Nvidia. But its long-term plan is for its own silicon to handle more of Microsoft’s internal demand. Reducing dependency helps with supply risk and improves margin control.
It also gives Microsoft more strategic flexibility. Using AMD chips remains part of its cloud offerings in the near term. But custom designs are intended to gradually take over more workloads.
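The “price performance” yardstick described above can be made concrete with a toy calculation. Everything in the sketch below is hypothetical: the chip names, throughput figures, utilization rates, and prices are invented placeholders, not real Nvidia, AMD, or Microsoft data. It only illustrates how delivered compute per dollar might be compared.

```python
# Toy price-performance comparison. All chip names, TFLOPS figures,
# utilization rates, and prices are hypothetical placeholders,
# not real vendor specifications.

def price_performance(peak_tflops: float, utilization: float, unit_cost: float) -> float:
    """Delivered TFLOPS per dollar: peak compute scaled by achieved utilization."""
    return peak_tflops * utilization / unit_cost

# Hypothetical fleet options: (peak TFLOPS, achieved utilization, cost in USD)
chips = {
    "vendor-gpu":  (1000.0, 0.45, 30000.0),
    "in-house-v1": (700.0,  0.55, 15000.0),
}

scores = {name: price_performance(*spec) for name, spec in chips.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 4))
```

On these made-up numbers the in-house part wins despite lower peak throughput, because the metric rewards cost and achieved utilization, not raw speed alone. That is the sense in which a slower custom chip can still be the better buy.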

Microsoft CTO Kevin Scott has stated that in the long run, Microsoft wants its own chips to dominate its data centers. He emphasized that currently, the best price performance “has been Nvidia for years,” but Microsoft wants to change that.
Scott also mentioned that Microsoft’s system design includes not only chips but networking, cooling, and the server stack. The ambition is systemic, not just replacing GPUs. His vision is about autonomy and optimization, not loyalty to any vendor.

Microsoft isn’t only making chips; it is thinking about the entire system. That includes network cards (developed under Maia-related projects), cooling technologies, and how racks and data centers are designed: for example, aligning how chips produce heat with how the cooling handles it, and how compute and networking interact for AI workloads. All this helps improve efficiency, reduce latency, and lower the total cost of ownership. The end goal is a fully optimized AI compute stack.

Custom chips require custom supporting hardware, including networking and cooling. Microsoft is reportedly investing in novel cooling approaches (such as microfluidic cooling) to deal with overheating and thermal bottlenecks.
Efficient networking is needed to move data between AI nodes without delays. Proprietary designs may allow better rack layouts, better fiber connections, and thermal management.
These improvements amplify the performance advantages of custom silicon. Without them, chips alone aren’t enough.

Making its own chips gives Microsoft greater control over its supply chain. With fewer third-party dependencies, manufacturer shortages are less likely to disrupt operations. It also allows Microsoft to better manage cost, geographic risk, and vendor negotiations.
Owning more of the stack may help in times of global silicon shortages. It could reduce lead times and improve availability for internal AI infrastructure. Over time, that could improve reliability and reduce unexpected costs.

Building and deploying your own chips at scale is difficult. Microsoft will need to scale production to meet rising AI demand. This means investing in fabs, partnerships, testing, and manufacturing pipelines. Early chips like Maia and Cobalt are just a start.
To replace Nvidia/AMD in many workloads, massive scale and reliability are essential. Scaling also includes ensuring compatibility with existing AI software tools. Microsoft is working toward larger deployments, but it’s a long journey.

Nvidia has a multi-year head start in high-performance AI GPUs, with tools, a software ecosystem, and high compute densities. Microsoft’s self-developed chips currently lag in raw performance for many top workloads.
This gap needs closing in throughput, memory bandwidth, precision support, and power efficiency. Also, the software, drivers, and ecosystem need to mature. Microsoft faces engineering and fabrication challenges. Still, efforts like Maia are signs Microsoft is serious about closing that gap.

Building custom silicon is expensive; R&D, tooling, and fabs all cost large amounts. For some workloads, using third-party GPUs might still be more cost-effective. Microsoft will need to balance between investing in in-house designs and using vendor GPUs when that makes sense.
Sometimes the marginal gains won’t justify the cost. The cost of failures or delays also matters. Effective tradeoff planning will be essential in deciding which workloads to run on custom chips.
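One way to frame that tradeoff is a simple break-even calculation: custom silicon pays off only once its fixed R&D cost is amortized across enough units that would otherwise have been bought from a vendor. The dollar figures below are invented purely for illustration; they are not actual Microsoft or vendor costs.

```python
import math

# Break-even sketch for in-house vs. vendor chips.
# All dollar figures are hypothetical, chosen only to show the shape of the tradeoff.

def break_even_units(fixed_rd_cost: float, in_house_unit_cost: float,
                     vendor_unit_cost: float) -> float:
    """Units at which total in-house cost (R&D + per-unit) matches buying from a vendor.

    Returns infinity if the in-house part is not cheaper per unit.
    """
    saving_per_unit = vendor_unit_cost - in_house_unit_cost
    if saving_per_unit <= 0:
        return math.inf  # never breaks even: no per-unit saving to amortize R&D against
    return fixed_rd_cost / saving_per_unit

# Hypothetical: $500M R&D, $10k per in-house chip vs. $30k per vendor GPU
n = break_even_units(500e6, 10_000, 30_000)
print(f"break-even at {n:,.0f} units")  # 25,000 units on these made-up numbers
```

The point of the sketch is the failure mode it encodes: if the per-unit saving is small or negative, no deployment volume justifies the fixed investment, which is exactly the case where staying with vendor GPUs “makes sense.”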

Microsoft’s Azure Maia 100 and Cobalt 100 are part of this roadmap, with production expected from 2025 onward, though newer designs have reportedly been delayed. Development of subsequent generations is underway, and Microsoft expects more workloads to shift to its own chips over time.
But Microsoft has acknowledged a compute crunch, with demand outstripping supply. So while the shift is planned, it won’t happen overnight; the transition will likely take multiple years of gradual migration.

For Microsoft Azure customers, this shift could mean a greater variety of chip options. Costs could fall if internal chips prove efficient, and integration and optimized services may improve. But there might be compatibility issues early on.
Customers might need to adapt to different precision modes or software stacks. Some workloads may still run better on vendor hardware until Microsoft’s chips reach parity. Azure’s mix of AMD, Nvidia, and Microsoft silicon helps buffer this transition.

Microsoft’s goal is autonomy in hardware and compute infrastructure. If successful, Microsoft could reduce costs, improve performance, and control more of its AI destiny. This shift also pressures GPU vendors to continue innovating.
Microsoft’s move could influence other cloud providers to build their own chips. Ultimately, the AI hardware landscape may become more diversified. But risks are high and the path is complex.
This slideshow was made with AI assistance and human editing.
Dan Mitchell has been in the computer industry for more than 25 years, getting started with computers at age 7 on an Apple II.
