Lisa Su has spent the last few years in Santa Clara, in the kind of glass-and-steel office park the semiconductor industry has built its identity around, building AMD into a credible alternative to Nvidia for AI workloads at hyperscaler scale. Most of the analyst community thought the effort was premature or unlikely to succeed. The MI300X chips started the conversation. The OpenAI partnership of 2025 gave it structural weight. And now a deal with Meta Platforms, worth up to $60 billion over five years, has moved AMD into a position that is reshaping the competitive landscape of the AI chip market in ways still being fully processed. The deal is structured around an equity arrangement that Jensen Huang publicly called "clever" while trying to make the word sound like an accusation of weakness.
The deal's headline terms are impressive on their own: starting in late 2026, AMD will supply Meta with customized Instinct MI450-series GPUs and sixth-generation EPYC "Venice" CPUs, delivering 6 gigawatts of AI compute to support Llama 4 development and the AI assistant products being built on top of it. Five years. Committed revenue. Supply assurance of a kind that Nvidia's constrained manufacturing schedules have made difficult to secure, and that Meta's AI ambitions demand. That is the visible structure of the arrangement, and it would matter on its own. What makes the deal structurally unusual, and what has driven the debate in the corners of the market where deal design gets examined rather than merely reported, is the equity component underneath it.
| Category | Details |
|---|---|
| Topic | AMD–Meta $60 Billion AI Chip Partnership (2026) |
| AMD CEO | Lisa Su |
| Nvidia CEO | Jensen Huang |
| Deal Value | Up to $60 Billion over 5 years |
| AMD Chips Supplied | Custom Instinct MI450-series GPUs + 6th Gen EPYC “Venice” CPUs |
| Computing Capacity | 6 Gigawatts of AI compute |
| Equity Incentive | 160 million share warrants (~10% AMD stake) at $0.01/share |
| Vesting Conditions | Performance milestones + share price targets (up to $600/share) |
| Similar Prior Deal | AMD–OpenAI equity-sharing arrangement (2025) |
| Meta’s AI Models | Llama 4 and future AI assistants |
| Nvidia’s Response | Multi-generational Blackwell/Rubin deals with Meta and others |
| Reference Website | amd.com |
AMD issued warrants permitting Meta to buy up to 160 million AMD shares at a nominal price of $0.01 each, roughly 10% of the company. The warrants do not all vest at once; they vest in tranches tied to technical milestones and share-price targets of up to $600 per share. The vesting structure aligns Meta's financial interests with AMD's stock performance in a specific way: as AMD executes on the technical milestones and the chip deliveries prove out in Meta's data centers, Meta's equity stake becomes more valuable, which gives Meta a direct financial incentive to want AMD to succeed broadly, not just as a chip supplier but as a company. The circularity is intentional. A customer holding a significant stake in its supplier has a different relationship with that supplier than a client who can switch vendors whenever a better offer arrives.
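To make the scale of that incentive concrete, a back-of-the-envelope sketch is useful. Only the 160 million shares, the $0.01 strike, and the $600 top price target come from the reported deal terms; the intrinsic-value formula is just standard warrant arithmetic, and no per-tranche vesting split is assumed.

```python
# Back-of-the-envelope intrinsic value of Meta's AMD warrants.
# Known reported terms: up to 160M shares, $0.01 strike, price targets up to $600.
STRIKE = 0.01
TOTAL_SHARES = 160_000_000

def intrinsic_value(vested_shares: int, share_price: float) -> float:
    """Value of vested warrants if exercised at the current share price."""
    return vested_shares * max(share_price - STRIKE, 0.0)

# If every warrant vested and AMD traded at the $600 top target:
full_value = intrinsic_value(TOTAL_SHARES, 600.0)
print(f"${full_value / 1e9:.1f}B")  # prints "$96.0B"
```

At the top price target, the stake would be worth on the order of $96 billion, which is why the structure binds Meta's interests to AMD's execution far more tightly than any ordinary supply contract could.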
Huang's public remarks on AMD's equity deals, characterizing them as "clever" and implying that giving away 10% of the company before the MI450 has proven itself in the market reveals how desperate competitors are, landed with the practiced nonchalance of someone managing a reaction rather than offering a genuine assessment. His underlying points are defensible regardless of how the hardware comparison is framed: the Blackwell and forthcoming Rubin architectures remain more powerful by most benchmarks, and AMD cannot quickly replicate CUDA's software ecosystem.
Those advantages are real. But the framing of AMD as merely catching up ignores what a $60 billion committed revenue stream over five years does to a company's manufacturing investment capacity, its product roadmap, and the confidence of every other hyperscaler watching the Meta decision and reevaluating its own vendor concentration.
Meta did not choose AMD on hardware specs alone, though the custom MI450's performance-per-watt profile on inference workloads was a genuine consideration. The other factor was supply security, which Nvidia's output constraints made impossible to overlook.
When the world's largest consumer of AI infrastructure makes five-year planning commitments and cannot reliably get delivery confirmations from its principal supplier, the appeal of a structured alternative with guaranteed capacity rises accordingly. The 2025 OpenAI contract set the precedent for the equity-sharing approach. The Meta pact built on that precedent, turning a one-off agreement into a structural strategy the market has to take seriously.
It is difficult to ignore that the competitive environment Nvidia has operated in for the past few years, with demand far exceeding supply, customers accepting whatever pricing and timelines that supply allowed, and no credible alternative for the most demanding AI workloads, has changed in a specific and documented way.
Meta is not abandoning Nvidia. Alongside the AMD agreement, it has maintained its own multi-generational commitment to Blackwell and Rubin chips, which is precisely the multi-vendor approach any procurement team managing infrastructure at Meta's scale ought to follow. But the leverage that comes with being the only viable option has a half-life, and the Meta-AMD deal suggests that Nvidia's has run out.
