Microsoft kicked off its annual software developer conference in Seattle on Monday, drawing thousands of programmers as the company looks to monetize its hefty investments in artificial intelligence.
The event comes as the tech giant works to turn years of AI research and spending into viable consumer and business products, most visibly through tools like Copilot, the assistant embedded in its popular Microsoft 365 suite.
So far this year, Microsoft has spent an estimated $64 billion, with a significant portion directed toward expanding the data centers that underpin its AI-driven services. A key player in the AI arms race, the Redmond-based company maintains a close partnership with OpenAI, the maker of ChatGPT. Recent developments, however, suggest a subtle but important recalibration of that relationship, as Microsoft pursues a broader set of collaborations across the AI ecosystem.
One notable shift came earlier this year, when Microsoft allowed OpenAI to work with Oracle on the massive “Stargate” data center project in Texas. Industry observers read the move as a sign that Microsoft intends to become a more neutral “arms dealer” in the AI space rather than tying its fortunes exclusively to its current partnerships. CEO Satya Nadella has also expressed optimism that AI-related costs will fall as algorithms are refined for greater computational efficiency.
Demand for AI services through Microsoft’s Azure cloud continues to rise, with the company increasingly housing its most lucrative AI tools within its own infrastructure. According to Thomas Blakey, an equity analyst at Cantor Fitzgerald, this strategy allows Microsoft to retain control over revenue-generating services and continuously optimize performance without incurring extra infrastructure costs.
To handle surges in computing demand, Microsoft is turning to “neocloud” providers like CoreWeave, which specializes in cloud computing powered by Nvidia’s AI chips. Blakey notes that rather than building more physical data centers of its own, Microsoft is leaning on these external providers for temporary computing power. “They’ve been consistent about shifting away from owning more dirt and cement,” he said, reflecting a broader trend toward leaner, more flexible infrastructure models in the AI era.
