Nvidia to invest up to $100 billion in OpenAI to expand ChatGPT's computing power
Partnership will add at least 10 gigawatts of Nvidia AI data centers to accelerate OpenAI’s infrastructure for its ChatGPT platform and other models

Nvidia will invest up to $100 billion in OpenAI as part of a strategic partnership designed to dramatically scale the computing power behind ChatGPT and OpenAI's broader AI stack. The alliance calls for deploying at least 10 gigawatts of Nvidia-powered data-center capacity dedicated to OpenAI's infrastructure, with the first gigawatt coming online in the second half of 2026 on Nvidia's Vera Rubin platform. The arrangement is structured as two intertwined transactions: OpenAI will purchase Nvidia chips with cash to power its systems, while Nvidia will take a non-controlling equity stake in OpenAI and supply the hardware and engineering support needed to run OpenAI's models at scale. The companies said the deal, disclosed via a letter of intent, is expected to be finalized in the coming weeks.
The partnership represents a major step in the ongoing race to lock in the compute resources that underpin cutting-edge AI development. OpenAI was previously reported to be valued at about $500 billion, underscoring the strategic importance of access to high-end accelerators as model sizes and workloads continue to expand. The first tranche of hardware purchases could begin once the definitive agreement for OpenAI to procure Nvidia systems is in place, according to people familiar with the discussions. Nvidia has indicated that deliveries would begin in late 2026 or later, consistent with its long-range schedule for the Vera Rubin platform.
"Everything starts with compute," OpenAI CEO Sam Altman has said, stressing that robust compute infrastructure is the foundation for future AI breakthroughs and scalable deployment. In a statement accompanying the LOI, Altman said the partnership would enable OpenAI to scale its capabilities while continuing to empower people and businesses with AI. The collaboration with Nvidia sits alongside OpenAI's broader ecosystem of partners, including Microsoft, Oracle, SoftBank and the Stargate participants, which are collectively pursuing the world's most advanced AI infrastructure.
The deal comes on the heels of broader moves in the AI landscape. Nvidia’s planned investment is paired with other high-profile alignments around OpenAI and its technology, including Microsoft’s ongoing involvement and a separate, non-binding deal that would give Microsoft a substantial equity stake in OpenAI’s for-profit entity. Nvidia’s involvement also follows the company’s recent commitment of $5 billion to chipmaker Intel, signaling a broader push to shore up supply and capabilities in a market where demand for AI chips outstrips supply.
Analysts and industry observers have warned that the scale of capital flowing into OpenAI and related AI infrastructure raises potential antitrust and regulatory questions. In June, the Justice Department and the Federal Trade Commission signaled that they would closely examine growing concentration among a handful of companies, including Microsoft, OpenAI and Nvidia, in the AI ecosystem. Regulators have also fielded concerns from state attorneys general about safety and consumer protection around AI products such as ChatGPT, including cases involving minors. OpenAI has argued that its nonprofit governance structure remains central to its mission, even as its for-profit subsidiary develops and sells its AI products.
The financing and structure of the Nvidia-OpenAI agreement reflect a model in which OpenAI gains access to chips and related infrastructure while Nvidia secures a long-term foothold in one of the world's most prominent AI platforms. The two sides described the LOI as a pathway to accelerate the deployment of Nvidia systems across OpenAI's AI stack, including the infrastructure needed to train and run large language models and other AI workloads. The deal underscores how hardware providers and AI developers are aligning more tightly as demand for scale accelerates and the cost of compute remains a primary constraint on innovation.
For Nvidia, the investment deepens a strategic relationship with OpenAI that has already included prior funding rounds and ongoing collaboration on AI infrastructure. Nvidia’s stock moved higher on the news, reflecting investor expectations that the partnership could help secure a steady demand stream for Nvidia’s data-center chips in a crowded, high-stakes market. The collaboration also interacts with existing agreements around data-center capacity and cloud partnerships involving OpenAI, Microsoft and other technology partners, all of which are pursuing large, multi-year expansions of AI infrastructure to support faster, more capable AI systems.
If completed as envisioned, the LOI would bring together two of the most influential players in AI infrastructure and could centralize OpenAI’s compute needs around Nvidia’s hardware stack for years. The partnership would be one of the most consequential in the AI arms race, potentially shaping the deployment of AI services across enterprise and consumer markets while setting benchmarks for cooling, power efficiency, and chip utilization at massive scales. Industry observers will watch closely how the financial terms, governance rights, and performance milestones are negotiated, as well as how regulators review the implications for competition in a market that already includes major stakes by Microsoft, Oracle, SoftBank and others.
As the deal advances, the parties will need to resolve how the investment intersects with OpenAI’s nonprofit roots and its for-profit subsidiary, a structure that has already drawn scrutiny and legal challenges from figures who helped establish the nonprofit entity. The evolving governance arrangement, including any influence leveraged by Nvidia as a strategic investor, will be watched for implications on openness, safety, and accountability in AI development.
The broader AI ecosystem, including long-standing collaborations and the emergence of new data-center ecosystems like Stargate, continues to hinge on the scale and reliability of compute resources. The partnership between Nvidia and OpenAI, paired with parallel investments and partnerships involving Microsoft and other participants, signals that the next phase of AI progress will depend as much on hardware access and operational scale as on software breakthroughs. As the industry advances, regulators and policymakers are likely to weigh how such deepening ties influence innovation, consumer protection, and competition in the AI economy, even as consumers look to benefit from faster, more capable AI products and services.