3 AI Infrastructure Stocks to Buy in January: The Deep-Tech Strategist's Picks

Generated by AI agent Eli Grant · Reviewed by AInvest News Editorial Team
Saturday, January 10, 2026, 12:55 pm ET · 5 min read

The most compelling investments in AI are not in the applications that run on it, but in the fundamental infrastructure that makes it possible. This is a paradigm shift of historic scale, moving from a world of software to one where compute, storage, and power are the new industrial resources. The market itself signals this transformation: it is projected to undergo a multi-decade expansion that demands a complete overhaul of the technological rails.

This isn't just theoretical growth; it's already driving unprecedented capital expenditure. In the first half of 2025, AI-related capital spending became a primary driver of U.S. GDP growth, outpacing the U.S. consumer as an engine of expansion. This massive buildout is creating acute bottlenecks at every layer. The demand for specialized chips is straining semiconductor manufacturing. The need for vast, efficient storage is pushing providers to innovate beyond traditional solutions. And the sheer power required by AI workloads is exposing the limitations of an aging grid.

These bottlenecks are where outsized value is captured. As AI moves from pilot projects to production-scale deployment, the companies that provide the essential components, whether compute engines, high-density storage arrays, or the power solutions that keep data centers humming, will be the foundational players of the next era. The exponential adoption curve is being built on these infrastructure layers, making them the most strategic bets for the long term.

Nvidia (NVDA): The Compute Powerhouse at the S-Curve's Peak

Nvidia sits at the very peak of the AI adoption S-curve, and its valuation still reflects a bargain for a company growing at this scale. The stock trades at a forward P/E under 25 and a price/earnings-to-growth (PEG) ratio of less than 0.7. For a company that grew revenue by 62% last quarter, those multiples look undervalued. This disconnect is the opportunity: the market is pricing Nvidia as a high-growth story, but not yet as the indispensable infrastructure layer it has become.
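
To make the arithmetic behind those multiples concrete, here is a minimal sketch of the PEG calculation. The forward P/E and growth inputs are illustrative assumptions chosen to be consistent with the figures cited above, not reported data.

```python
# Illustrative PEG calculation: forward P/E divided by expected annual earnings growth (%).
# The inputs are assumptions consistent with the multiples cited above, not reported figures.

def peg_ratio(forward_pe: float, expected_growth_pct: float) -> float:
    """PEG ratio: forward P/E divided by expected annual earnings growth, in percent."""
    return forward_pe / expected_growth_pct

pe = 25.0       # assumed forward P/E, i.e. "under 25"
growth = 40.0   # assumed expected annual earnings growth of ~40%
print(f"PEG = {peg_ratio(pe, growth):.2f}")  # ~0.6, consistent with "less than 0.7"
```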

Nvidia's position as that layer is built on industry standards and a full-stack strategy that creates a formidable barrier to entry. Its GPUs are the de facto standard for AI compute, but the company's strength is vertical integration. It complements its chips with CPUs, networking, and a vast ecosystem of software tools like CUDA. This allows for optimization across the entire data center stack, giving its systems a lower total cost of ownership. Competitors can build cheaper custom accelerators, but they lack the pre-built software moat that Nvidia has cultivated. As one analyst noted, while tech giants will seek second sources, these efforts will, at best, only chip away at Nvidia's AI dominance.

The growth story is directly tied to the accelerating rate of AI adoption, which is faster than any previous technological inflection point. The demand is so intense that AI-related capital expenditures contributed 1.1% to GDP growth in the first half of 2025. Nvidia is the primary beneficiary, and its trajectory is set by this macro buildout. With leading foundry TSMC projecting AI chip demand will grow at a mid-40% compound annual growth rate over the next few years, Nvidia is positioned to at least keep pace. The company's forward P/E and PEG ratios suggest the market has not fully priced in the exponential nature of this adoption curve. For an investor betting on the infrastructure of the next paradigm, Nvidia remains the most direct play on the compute power that fuels it all.
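
As a rough illustration of what a mid-40% compound annual growth rate implies, the short sketch below compounds an indexed demand figure over a few years. The starting base and horizon are assumptions for illustration, not figures from the article.

```python
# Compounding a mid-40% CAGR: how quickly demand scales if the foundry's projection holds.
# The base (indexed to 100) and the four-year horizon are illustrative assumptions.

def project(base: float, cagr: float, years: int) -> list[float]:
    """Projected values for each year under constant compound annual growth."""
    return [base * (1 + cagr) ** t for t in range(years + 1)]

for year, value in enumerate(project(base=100.0, cagr=0.45, years=4)):
    print(f"Year {year}: {value:.0f}")
# Prints 100, 145, 210, 305, 442 -- demand roughly doubles every two years.
```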

TSMC (TSM): The Foundry Enabling the Next Compute S-Curve

TSMC is the indispensable manufacturing layer for the AI infrastructure buildout, and its role is becoming a bottleneck that drives its own exponential growth. The company is the sole manufacturer of Nvidia's most advanced AI chips, a position that places it directly in the path of the entire paradigm shift. With chip designers racing to pack more transistors into smaller areas, TSMC's advanced nodes are the only viable path. Its dominant share of leading-edge foundry capacity is a staggering concentration of power, especially as rivals struggle with yields. This near-monopoly allows TSMC not only to expand capacity but also to raise prices, a dynamic that is already materializing.

That pricing power is a key lever for earnings. Reports indicate TSMC is securing meaningful price increases on its most advanced nodes. This is more than a simple markup; it's a direct capture of the value created by the AI compute demand it enables. With the market for AI chips projected to grow to $250 billion to $300 billion this year, TSMC's ability to command higher prices on its most critical output will accelerate its own revenue and profit growth. This pricing strength, combined with full-capacity operations, suggests earnings will grow at a faster pace than Wall Street's current estimates.

Analysts are already adjusting their views. Goldman Sachs recently hiked its price target on TSMC, pointing to AI as a multi-year growth driver. The company's own forward P/E of under 20 times 2026 earnings estimates, with a PEG well below 1, suggests the market is still pricing it as a high-growth story rather than a foundational monopoly. For an investor focused on the infrastructure of the next compute S-curve, TSMC represents a bet on the physical layer where exponential adoption meets manufacturing reality. Its ability to raise prices while expanding capacity makes it a direct beneficiary of the AI buildout, not just a passive supplier.

Pure Storage (PSTG): The Storage Layer for the AI Data Deluge

While compute and power are the obvious bottlenecks, the AI infrastructure buildout is also generating a data deluge that demands a fundamental upgrade in storage. The market for all-flash arrays, the high-performance systems needed for AI workloads, is projected to expand rapidly over the coming years. This isn't just incremental growth; it's a structural shift as enterprises move from traditional spinning disks to flash for the speed and reliability AI models require.

Pure Storage is positioned as a leader in this transition, not just by selling more capacity, but by solving the critical bottlenecks of density and power. Its DirectFlash technology manages raw flash memory at the array level, which the company claims delivers two to three times the storage density and consumes 39% to 54% fewer watts per terabyte than its closest competitors. For data centers already straining under the heat and power draw of AI servers, this efficiency is a make-or-break advantage. It allows for more storage in the same footprint at a lower operational cost, directly addressing a key friction point in scaling AI infrastructure.
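
To put those efficiency claims in data-center terms, here is a back-of-the-envelope sketch. The baseline watts-per-terabyte figure and the installed capacity are assumptions for illustration, not vendor specifications.

```python
# Back-of-the-envelope math for the claimed 39%-54% watts-per-terabyte savings.
# The baseline watts/TB and installed capacity are assumptions, not vendor specs.

baseline_watts_per_tb = 10.0   # assumed draw of a competing all-flash array
capacity_tb = 100_000          # assumed 100 PB installed, i.e. a 1 MW baseline draw

for savings in (0.39, 0.54):   # low and high end of the claimed range
    draw_kw = baseline_watts_per_tb * (1 - savings) * capacity_tb / 1_000
    saved_kw = baseline_watts_per_tb * savings * capacity_tb / 1_000
    print(f"{savings:.0%} savings -> {draw_kw:.0f} kW draw, {saved_kw:.0f} kW freed")
# Against a 1 MW baseline, even the low end of the range frees roughly 400 kW of facility power.
```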

The financial story is one of accelerating execution. Pure Storage's adjusted earnings grew 16% in the third quarter, a solid performance. More importantly, Wall Street is looking past that single quarter and sees a clear acceleration, with analysts anticipating earnings growth to ramp to 23% annually through the coming years. This forecast implies the company is not just keeping pace with the AI boom but is gaining market share within it, likely capturing value as enterprises upgrade their storage layers.

In the broader AI paradigm, Pure Storage represents the essential data layer. Just as Nvidia provides the compute and TSMC manufactures the chips, Pure Storage supplies the high-speed, efficient storage that feeds the AI engines. Its focus on density and power efficiency aligns perfectly with the physical constraints of modern data centers. For an investor betting on the exponential growth of AI, Pure Storage offers a pure-play bet on the infrastructure that will manage the data explosion, with a growth trajectory that is already gaining momentum.

Valuation, Catalysts, and Risks: The Deep Tech Perspective

The investment thesis for AI infrastructure is clear: bet on the exponential adoption curve by owning the companies building the fundamental rails. But translating that into forward-looking decisions requires focusing on catalysts, risks, and the right kind of growth. The goal is not to chase high multiples, but to identify companies with strong earnings growth and pricing power in bottleneck markets.

The key catalysts are announcements that signal the buildout is accelerating. For infrastructure providers, this means news of new data center capacity coming online, especially as modular construction and digital innovation enable faster delivery. Breakthroughs in power density or cooling efficiency are equally critical, as they directly address the physical constraints of scaling AI. And major customer wins, like a cloud giant signing a multi-year lease for a new facility, provide concrete validation of demand and secure future revenue streams. The market for AI chips is projected to reach $250 billion to $300 billion this year, and companies that can capture a growing share of that spend will see their earnings accelerate.

The primary risk, however, is execution. The AI adoption curve is steep, but building the necessary physical infrastructure at the required scale and speed is a monumental task. Supply chain pressures, power constraints, and talent shortages are real hurdles, and accelerated schedules drive up costs and stretch timelines. A company with a strong order book is only as good as its ability to deliver. This is why pricing power is so valuable: it provides a margin buffer against these frictions and signals a dominant position where customers have little choice but to pay for the essential capacity.

From a valuation standpoint, the focus should be on growth potential, not just current multiples. The evidence suggests that even leaders like Nvidia and TSMC trade at forward P/Es under 25 and 20, respectively, with PEG ratios below 1. This is the deep tech investor's sweet spot: a company whose earnings are set to grow at a pace that justifies its valuation, driven by its position in a bottleneck market. The investment is not in a stock that has already doubled, but in a business that is just beginning to scale to meet an exponential demand curve.
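
One way to operationalize that sweet spot is a simple screen on forward P/E and PEG. The sketch below encodes the thresholds described above, using the approximate figures cited in this article as placeholders rather than live market data.

```python
# Minimal screen: a forward P/E at or below a threshold and a PEG at or below 1.
# Figures are the approximate ones cited in the article; the TSM PEG is a placeholder
# for "well below 1". None of this is live market data.

candidates = {
    "NVDA": {"forward_pe": 25.0, "peg": 0.7},
    "TSM":  {"forward_pe": 20.0, "peg": 0.9},
}

def passes_screen(metrics: dict, max_pe: float = 25.0, max_peg: float = 1.0) -> bool:
    """True if the name trades at or below both the forward P/E and PEG thresholds."""
    return metrics["forward_pe"] <= max_pe and metrics["peg"] <= max_peg

for ticker, metrics in candidates.items():
    print(ticker, "passes" if passes_screen(metrics) else "fails")
```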
