Photo via Media Post

***

Advanced Micro Devices Inc. (AMD), the semiconductor company known for producing Central Processing Units (CPUs) and Graphics Processing Units (GPUs), has officially joined the AI arms race. 

Its chips—no, not the kind you dip in guac—power a wide variety of tech. In simplest terms, the CPU acts as the brain, executing instructions and shuttling data where it needs to go, while the GPU crunches thousands of calculations simultaneously. Together, they keep the tech world humming and, ultimately, move daily digital life forward.

Now, one of AI’s most prominent players wants in. OpenAI, the American AI research organization most famous for creating ChatGPT, continues to expand its global ambit. However, most of its AI models are currently focused on solving the world’s most pressing issue: how to write better email subject lines. 

Nonetheless, that expansion demands a greater supply of AI chips, as its models churn through vast datasets and calculations. Sam Altman, CEO of OpenAI, has been rather vocal about the struggle to procure enough computing power to meet the demands of modern AI systems.

Fortunately, sourcing the chips is no longer a quandary. On October 6, 2025, AMD and OpenAI announced a multibillion-dollar partnership that positions AMD squarely in the escalating AI race, directly in the rearview mirror of its primary semiconductor competitor, Nvidia.

Nvidia has long dominated the AI chip market, spearheading it with its GPU architectures (the blueprints by which its chips handle workloads) and its proprietary AI software stack, the layered software tools that govern how those chips operate. Together, these technologies underpin much of today’s machine learning infrastructure.

Nvidia initially focused its chips on video gaming before expanding into AI, and by the first quarter of 2025 it held a 92% share of the desktop and laptop GPU market. This landmark collaboration between OpenAI and AMD, however, challenges that dominance.

While it is too early to call this a sustained shift in leadership, the change is notable. AMD’s instruction sets (the basic commands its chips understand) and system designs give it an open, modular edge in software compared with Nvidia’s closed, strict designs. Nvidia’s “closed design” refers to its tightly controlled ecosystem, which is highly efficient so long as workloads stay within its own platforms and licenses. AMD, by contrast, bets on openness, letting developers tinker with and optimize performance using the tools of their choice. That flexibility is becoming increasingly desirable as AI tasks grow more complex.

In fact, researchers have argued that AMD’s open software stack is better suited to research:

“We argue that an open software stack… may be able to provide much-needed flexibility and reproducibility in the context of real-time GPU research… In support of this claim, we summarize how closed-source platforms have obstructed prior research using NVIDIA GPUs, and then demonstrate that AMD may be a viable alternative….”

This sentiment surfaces in various white papers (formal reports that examine complex technical issues) and industry research highlighting AMD’s AI performance capabilities. The appetite for computing power continues to surge, and analyses have long suggested that AMD’s integrated chip ecosystems deliver a lower total cost of ownership for AI than Nvidia’s.

Photo via MarketWatch

***

With this new deal, AMD’s infrastructure can now provide OpenAI the inference optimization it needs. The company plans to supply hundreds of thousands of advanced chips to OpenAI, a massive vote of confidence in AMD’s fast-growing role in the tech world. As the industry shifts, this transition represents a major opportunity for AMD to capitalize on its architectural efficiency and competitive pricing.

As part of the deal, AMD issued a warrant granting OpenAI the right to purchase up to 160 million AMD shares (approximately a 10% stake in the company) at a nominal price of one cent per share. The arrangement ties OpenAI directly to AMD’s growth trajectory.
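For a sense of scale, the warrant’s terms can be sanity-checked with some quick arithmetic. This is a rough sketch, and it assumes the “~10% stake” figure is measured against AMD’s shares outstanding at the time of the announcement:

```python
# Back-of-the-envelope check on the warrant figures quoted above.
# Assumption: the ~10% stake is relative to shares outstanding
# at announcement time.

shares_in_warrant = 160_000_000  # up to 160 million shares
price_cents = 1                  # one cent per share
stake_percent = 10               # approximately a 10% stake

# Cost to exercise the full warrant, in dollars
exercise_cost = shares_in_warrant * price_cents / 100
print(f"Full exercise cost: ${exercise_cost:,.0f}")  # $1,600,000

# Implied total share count if 160 million shares is ~10% of the company
implied_shares = shares_in_warrant * 100 // stake_percent
print(f"Implied shares outstanding: {implied_shares:,}")  # 1,600,000,000
```

In other words, exercising the entire warrant would cost OpenAI only about $1.6 million for a stake worth tens of billions at current prices, which is why the warrant functions more as an incentive alignment than a purchase.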

The deployment is expected to deliver up to six gigawatts of computing power, with the first one-gigawatt rollout scheduled for the second half of 2026. Analysts project the collaboration could generate tens of billions of dollars in annual revenue, with total new revenue exceeding $100 billion over the next four years. The rollout also positions AMD to strengthen its AI software stack atop its maturing chips.

On Wednesday, October 8, 2025, AMD’s stock closed at $235.56, an 11.33% increase from the previous day’s close. It marks the company’s best single-day performance since April 2016. Skeptics warn of a speculative tech bubble. Nonetheless, this gambit is nothing short of impressive—especially amid the surge of convoluted AI-linked corporate investments.
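The day’s figures imply a prior close of roughly $211.59, which a couple of lines of arithmetic can confirm (a quick sketch using only the numbers reported above):

```python
# Recover the implied previous close from the reported figures.
close_price = 235.56   # AMD close on October 8, 2025
percent_gain = 11.33   # reported single-day increase

previous_close = close_price / (1 + percent_gain / 100)
print(f"Implied previous close: ${previous_close:.2f}")  # ~$211.59
```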

Rather than swinging for the fences, AMD is laying the groundwork for a longer game. By reopening the question of what compute infrastructure should be, these tech giants are changing the terms of the AI conversation.

***

This article was edited by Annika Trippel and Dysen Morrell.
