OpenAI’s Financial Landscape: Revenue Sharing and Rising Costs
After a year of frenzied dealmaking and speculation about an eventual IPO, scrutiny of OpenAI’s financials is intensifying. Leaked documents obtained by tech blogger Ed Zitron offer a look at OpenAI’s revenue and compute costs over the past few years.
Revenue Share Payments and Financial Trends
This week, Zitron reported that Microsoft received approximately $493.8 million in revenue-share payments from OpenAI in 2024, a figure that rose to $865.8 million in the first three quarters of 2025, according to documents he reviewed.
OpenAI reportedly shares 20% of its revenue with Microsoft under a deal that followed the tech giant’s investment of more than $13 billion in the AI startup. Neither OpenAI nor Microsoft has confirmed that percentage.
The financial relationship between the two companies runs in both directions. Microsoft also shares revenue with OpenAI, returning roughly 20% of the revenue generated by services such as Bing and the Azure OpenAI Service, which are built on OpenAI’s models.
Understanding the Revenue Dynamics
A source familiar with the situation told TechCrunch that the leaked payments represent Microsoft’s net revenue share, excluding whatever Microsoft paid OpenAI in Bing and Azure royalties. That nuance complicates the picture, since Microsoft doesn’t disclose those royalty figures alongside its reported revenue shares.
Despite that opacity, the leaks offer a rare glimpse into one of the most sought-after companies in the private markets, showing not only its revenue but how its spending compares to that income.
Estimating OpenAI’s Revenue and Expenses
Based on the reported 20% revenue-share rate, OpenAI’s revenue would have been at least roughly $2.5 billion in 2024 and $4.33 billion in the first three quarters of 2025 (the arithmetic is sketched below). Previous analyses from The Information pegged OpenAI’s 2024 revenue at around $4 billion and $4.3 billion for the first half of 2025. OpenAI CEO Sam Altman has also suggested that the company’s annualized revenue run rate could exceed $20 billion by year’s end and reach $100 billion by 2027.
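As a rough sanity check, here is the back-of-the-envelope arithmetic behind those implied revenue figures. It assumes the unconfirmed 20% revenue-share rate and the leaked payment amounts; it is an illustrative sketch, not an official or audited figure.

```python
# Back-calculate OpenAI's implied revenue from Microsoft's reported
# revenue-share receipts, assuming the unconfirmed 20% rate.
REVENUE_SHARE_RATE = 0.20  # assumption: neither company has confirmed this

payments_to_microsoft = {
    "2024 (full year)": 493.8e6,             # leaked figure reported by Zitron
    "2025 (first three quarters)": 865.8e6,  # leaked figure reported by Zitron
}

for period, payment in payments_to_microsoft.items():
    implied_revenue = payment / REVENUE_SHARE_RATE
    print(f"{period}: implied revenue ≈ ${implied_revenue / 1e9:.2f}B")

# Prints roughly:
# 2024 (full year): implied revenue ≈ $2.47B
# 2025 (first three quarters): implied revenue ≈ $4.33B
```

Note that these implied figures sit well below The Information’s earlier estimates, which is part of why the payments are best read as a floor rather than a full picture of OpenAI’s revenue.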
Running these models, however, requires significant computational resources. Zitron’s analysis suggests OpenAI spent approximately $3.8 billion on inference in 2024, rising to about $8.65 billion in the first nine months of 2025. Inference covers the processing power needed to run already-trained models and generate responses.
The Cost of Computing Power
Traditionally, OpenAI has relied heavily on Microsoft Azure for compute access, although the company has recently explored partnerships with other cloud providers, such as CoreWeave, Oracle, AWS, and Google Cloud. Prior reports estimated OpenAI’s total compute expenditure at around $5.6 billion for 2024, with a reported cost of revenue of $2.5 billion during the first half of 2025.
A source told TechCrunch that while OpenAI’s training costs largely consist of credits awarded by Microsoft, its inference expenses are mostly paid in cash. That raises a concern: if the leaked figures and the 20% revenue-share assumption hold, OpenAI may be spending more on inference alone than it earns in revenue, as the rough comparison below illustrates.
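To make that concern concrete, here is the same kind of illustrative arithmetic, comparing the implied revenue figures (derived above from the 20% revenue-share assumption) against the inference spend Zitron reports. Again, this is a sketch based on leaked and unconfirmed numbers, not audited financials.

```python
# Compare implied revenue (from the 20% revenue-share assumption) against
# the inference spend reported in Zitron's analysis, for the same periods.
figures = {
    # period: (implied revenue, reported inference spend), in USD
    "2024 (full year)": (2.47e9, 3.8e9),
    "2025 (first nine months)": (4.33e9, 8.65e9),
}

for period, (revenue, inference) in figures.items():
    shortfall = inference - revenue
    print(f"{period}: inference exceeds implied revenue by ${shortfall / 1e9:.2f}B")

# Prints roughly:
# 2024 (full year): inference exceeds implied revenue by $1.33B
# 2025 (first nine months): inference exceeds implied revenue by $4.32B
```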
The Broader Implications for the AI Landscape
This financial picture feeds ongoing debate about an AI bubble. If a model powerhouse like OpenAI is still running in the red, what does that imply for the vast investments and lofty valuations across the AI industry?
OpenAI declined to comment, and Microsoft has not responded to inquiries from TechCrunch, leaving many questions lingering regarding this high-stakes partnership.
If you have sensitive information or confidential documents related to the AI industry, please reach out to reporters Rebecca Bellan at rebecca.bellan@techcrunch.com or Russell Brandom at russell.brandom@techcrunch.com. For secure communication, you can also contact them via Signal at @rebeccabellan.491 and @russellbrandom.49.