
To most users, today’s AI chatbots feel increasingly interchangeable. Whether you’re using one platform or another, the experience often seems similar, suggesting the software layer of AI is drifting toward commoditization.
But beneath the surface, the hardware powering that intelligence tells a very different story.
AMD cuts deals. Nvidia sets the terms.
Advanced Micro Devices $AMD, the clear No. 2 player in AI GPUs, has struck major supply agreements that included giving customers potential equity exposure. Deals with companies like OpenAI and Meta $META involved unusually generous terms to secure massive chip purchases.
By contrast, Nvidia $NVDA doesn’t appear to need such incentives. Its multi-year partnership with Meta involves the sale of millions of GPUs without handing over ownership stakes or similar concessions.
That difference highlights the gap in bargaining power between the two companies. AMD is competing for share. Nvidia is defining the market.
CUDA is the real moat
Nvidia’s dominance isn’t just about raw hardware performance. Its chips come bundled with CUDA, a deeply entrenched software ecosystem that developers have built around for years.
Switching away from that environment isn’t trivial. It requires retraining teams, rewriting code, and re-optimizing entire workflows. That stickiness lets Nvidia charge premium prices and sustain strong demand even as competitors improve.
In other words, Nvidia doesn’t just sell chips. It sells a platform.
AI hardware is more like energy than software
A useful analogy comes from the energy sector. Consumers don’t care whether electricity comes from solar panels, hydropower, or natural gas as long as the lights turn on. Similarly, end users may not notice which chatbot engine is responding.
But upstream, the inputs matter enormously. Data centers packed with GPUs consume vast amounts of power and capital, and not all hardware options are interchangeable from an operator’s perspective.
That’s why, even as AI applications become ubiquitous and similar on the surface, the battle for the underlying compute infrastructure remains fiercely competitive and highly profitable for the leaders.