Two Rails for the AI S-Curve: AMD and Nvidia in 2026

Generated by AI agent · Eli Grant · Reviewed by AInvest News Editorial Team
Wednesday, January 7, 2026, 11:50 am ET · 4 min read

The AI boom is hitting a critical inflection point. The explosive growth of the past few years, driven by the massive computational demands of training giant models, is giving way to a new phase. The workloads are shifting. In 2026, the focus will pivot from training to inference: the process of using a trained model to answer questions, generate text, or analyze data in real time. This isn't a slowdown; it's a paradigm shift that will redefine the infrastructure stack.

The numbers tell the story of this transition. Inference is projected to account for roughly two-thirds of all AI compute by 2026, up from a third in 2023 and half in 2025. This surge in inference demand is creating a massive new market: inference-optimized chips are expected to exceed US$50 billion in sales in 2026. The acceleration is global and relentless, with executives like AMD CEO Lisa Su citing "AI everywhere" as a primary driver of computing demand. That demand is no longer niche; it's becoming embedded in every enterprise and consumer device.

This shift creates a dual-track opportunity for infrastructure. While inference will dominate the total compute load, a Deloitte analysis notes that a majority of those computations will still run on cutting-edge, expensive, power-hungry AI chips worth over $200 billion. These will remain central to large data centers and enterprise solutions. Yet the nature of the workload changes: inference tasks are less computationally intensive than training, opening the door for specialized, cost-optimized chips that could be deployed on edge devices. The bottom line is that exponential demand growth is accelerating, with compute needs projected to rise four- to five-fold per year out to 2030. The infrastructure rails for this next phase, both the high-end central systems and the distributed edge chips, are now being built.
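
To put that projected growth rate in perspective, here is a quick back-of-the-envelope sketch (illustrative only; the normalized 2026 baseline is an assumption, with the four- to five-fold annual rate taken from the projection above):

```python
# Illustrative compounding of the cited 4x-5x annual compute growth.
# The 2026 baseline of 1.0 is a normalization assumed for this sketch.
BASE_YEAR, END_YEAR = 2026, 2030

for rate in (4, 5):
    demand = 1.0  # compute demand normalized to 1.0 in 2026
    for _ in range(BASE_YEAR, END_YEAR):
        demand *= rate
    print(f"At {rate}x per year, 2030 demand is {demand:,.0f}x the 2026 level")
```

Even at the low end, that compounds to a roughly 256-fold increase in four years, which is why both rails, the central systems and the edge chips, are being laid at once.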

AMD: Building the Inference Infrastructure Layer

The shift to inference is creating a new infrastructure layer, and AMD is positioning itself as a key builder. CEO Lisa Su has declared that demand for AI computing is "going through the roof," driven by the spread of AI across every industry. This isn't just incremental growth; it's the foundational demand for the next phase of the S-curve. AMD's strategy is to capture a major share of the inference-optimized chip market, which is projected to grow to over US$50 billion in 2026. The company is building the rails for this distributed compute, moving beyond just the central data center.

Financially, AMD is executing with accelerating momentum. Its stock, however, has lagged the broader tech rally, trading at a discount. This presents a potential entry point for investors betting on the infrastructure build-out. The company is scaling its production and bundling its high-end chips into massive, efficient systems; Su noted one platform uses seventy-two of these high-cost chips to deliver the performance needed for enterprise AI. This focus on system-level integration, not just individual chips, is critical for capturing the total cost of ownership advantage in inference workloads.

The bottom line is that AMD is constructing the physical layer for the inference paradigm. By targeting the massive, growing market for inference chips and demonstrating strong financial execution, the company is building the fundamental infrastructure that will support the "AI everywhere" vision. Its current valuation may not fully reflect the exponential adoption curve it is helping to create.

Nvidia: The Compute Leadership Layer and the China Catalyst

Nvidia's dominance in the AI compute stack is its most durable moat. Its chips are the fundamental building blocks for developing and running the world's most advanced AI models. This leadership position is now set to be turbocharged by the potential reopening of China, its largest single market. The catalyst is clear: CEO Jensen Huang stated Tuesday that the company is seeing "very high" customer demand in China for its H200 AI chips, with the U.S. government signaling approval for export.

The financial implications are massive. Huang has previously estimated that the Chinese market could be worth $50 billion per year. Crucially, none of those sales are currently included in Nvidia's forecasts. This means the company's recent two-year revenue guidance of $500 billion does not account for this potential windfall. If licenses are finalized and Chinese import approvals follow, Nvidia could see billions in additional revenue flow in 2026 and beyond.
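
As a rough, hedged illustration of scale, using only the two figures cited above (the comparison itself is a back-of-the-envelope exercise, not company guidance):

```python
# Sizing the potential China windfall against the existing two-year outlook.
# Both dollar figures are the estimates cited in this article.
two_year_guidance_bn = 500  # Nvidia's two-year revenue guidance, in $B
china_annual_bn = 50        # Huang's estimate of the annual China market, in $B

upside_bn = 2 * china_annual_bn            # two years of potential China sales
uplift = upside_bn / two_year_guidance_bn  # share of existing guidance
print(f"China could add ~${upside_bn}B, about {uplift:.0%} on top of guidance")
```

In other words, even partial capture of the estimated market would move the guidance materially, which is why the reopening reads as a catalyst rather than a rounding error.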

The setup is a classic S-curve inflection. Nvidia has already fired up its supply chain, with H200s flowing through the line again. The demand signal is strong, and the company is ready to ship. While the H200 is a generation behind the latest models, its performance and lack of intentional throttling make it a critical tool for Chinese firms building their own AI. The bottom line is that the reopening of China is not a minor market expansion; it is a major catalyst that could add a full decade's worth of growth to Nvidia's trajectory, accelerating its path along the exponential compute adoption curve.

Valuation and Catalysts: The Buildout's Financial Impact

The financial impact of the AI buildout is now being felt, but the market is sorting the winners from the mere spenders. The consensus estimate for 2026 capital expenditure by AI hyperscalers has climbed to $527 billion, a massive commitment that underscores the scale of the infrastructure push. Yet, investors have rotated away from AI infrastructure companies where growth in operating earnings is under pressure and capex spending is debt-funded. This selective rotation shows the market is moving beyond simple exposure to AI spending. The average stock in the infrastructure group returned 44% year-to-date, but that gain is not keeping pace with the underlying earnings growth, creating a valuation disconnect.

The next phase of the AI trade is expected to involve AI platform stocks and productivity beneficiaries, not just the builders of the physical layer. Goldman Sachs Research notes that attention is starting to shift to companies with the potential for AI-enabled revenues. This means the financial impact will be felt more broadly, but the near-term catalysts remain tied to infrastructure execution and market expansion for the leaders. For AMD and Nvidia, the key catalyst is the potential reopening of the Chinese market for advanced AI chips, which, as detailed above, could add billions to revenue forecasts that currently include no China sales at all.

The bottom line is that the exponential adoption thesis is now being tested against financial reality. The buildout is real and massive, but the market is demanding proof that capex translates to durable earnings growth. For AMD and Nvidia, the path forward involves two tracks: demonstrating that their infrastructure is essential to the inference paradigm shift, and capturing the next wave of demand, like the potential Chinese market. The financial impact will be significant, but the shift in investor focus means that only those companies showing a clear link between their spending and future revenues will be rewarded.
