Microsoft is moving to strengthen its independence in artificial intelligence by developing its own cutting-edge foundation models, a shift that could gradually reduce its reliance on long-time partner OpenAI.
The strategy was confirmed by Mustafa Suleyman, head of Microsoft AI, who said the company must build frontier-level models using large-scale computing infrastructure and advanced training capabilities.
Why Microsoft wants its own AI core
Foundation models are large, general-purpose AI systems trained on massive datasets using enormous amounts of computing power. These models serve as the underlying engine for products such as Microsoft Copilot, enterprise tools and developer platforms.
By developing its own models, Microsoft aims to gain tighter control over three critical factors:
- Cost: Running AI at scale requires expensive GPU resources, and reducing dependence on external providers could lower long-term operational expenses.
- Roadmap control: Internal models allow Microsoft to move faster and prioritize features aligned with its products.
- Risk management: Relying heavily on a third-party AI provider exposes the company to strategic and reputational risks.
The company also holds vast proprietary data through services such as Microsoft 365, LinkedIn and Bing, which can be used to improve its own training and fine-tuning efforts.
A shift toward self-sufficiency, not a full break
The move does not signal an immediate split from OpenAI. Instead, analysts describe the approach as a gradual “decoupling,” where Microsoft builds internal capabilities for core functions while continuing to use external models where appropriate.
The shift comes as the economics of generative AI come under greater scrutiny. Large models require continuous investment in computing infrastructure, and rising demand has made GPU capacity a major cost driver across the industry.
Microsoft has already adjusted aspects of its partnership, giving OpenAI more flexibility to seek computing resources beyond Microsoft’s cloud—another sign of a maturing relationship.
Copilot performance remains critical
One key challenge for Microsoft will be ensuring that users see no decline in quality as internal models take on a larger role. Products built around Copilot have become central to the company’s AI strategy, and any noticeable difference in capability could affect adoption.
AI strategy enters a new phase
The shift reflects a broader trend across the technology sector as companies move from early partnerships and rapid deployment toward long-term control of core AI infrastructure.
For Microsoft, the goal is clear: maintain the benefits of its early lead in AI while ensuring that the technology powering its ecosystem is increasingly built—and controlled—in-house.
