Broadcom’s AI Networking Fabric Surging on 106% Revenue Growth—Can It Build the Rails for AI’s S-Curve?

Generated by AI AgentEli GrantReviewed byAInvest News Editorial Team
Saturday, Mar 28, 2026 11:07 pm ET5min read
AVGO--
NVDA--
Speaker 1
Speaker 2
AI Podcast:Your News, Now Playing
Aime RobotAime Summary

- AI infrastructureAIIA-- is growing exponentially, with data center networking markets projected to surge from $39.5B in 2025 to $93B by 2032.

- Broadcom's AI networking revenue surged 106% YoY to $8.4B, with management forecasting 40% of total AI revenue by 2027.

- NVIDIA's $2B investment in CoherentCOHR-- secures optical interconnects critical for scaling AI clusters beyond 12.8T architectures.

- Infrastructure winners must match AI's S-curve adoption through physical deployments, next-gen optics adoption, and supply chain agreements.

The adoption of AI is following a pattern of explosive acceleration unlike any in history. According to mathematical models, AI tools reached 50% penetration in just 3 years, a pace that dwarfs earlier technologies like the telegraph. This compressed S-curve means the world is not just adopting AI-it is racing to build the physical infrastructure to run it, and the clock is ticking.

The entire process of training and deploying these models happens almost exclusively within data centers. As these facilities become the world's most critical infrastructure, their demands are shifting beyond the chips themselves. AI doesn't run on processors alone; it runs on the networks that connect them. Modern AI factories now span tens of thousands of GPUs, operating as a single distributed engine. To keep these accelerators fully utilized, the underlying infrastructure must provide deterministic latency, lossless throughput, and the ability to scale across multiple sites.

This creates a clear hierarchy of required infrastructure layers. At the foundation is the data center real estate and power grid, which must expand dramatically to house and feed these clusters. Then comes the critical networking and optical layer. High-speed interconnects and integrated systems are needed to move data between tens of thousands of processors without bottlenecks. This is the "fabric" that defines the AI data center, and it scales non-linearly with cluster size. Suppliers of networking equipment and optical components are therefore positioned to benefit from the same exponential growth curve as the AI models themselves. The winner in this race will be the company that can build and deploy this foundational layer at the speed of the AI adoption S-curve.

Mapping the Exponential Growth in Key Infrastructure Layers

The market for the physical backbone of AI is not just growing-it is expanding on an exponential trajectory. The global data center networking market is estimated to grow from about $39.5 billion in 2025 to more than $93 billion by 2032. This isn't a steady climb; it's a compression of growth, mirroring the S-curve adoption of AI itself. The driver is the non-linear scaling of AI clusters. As these factories of intelligence grow from hundreds to tens of thousands of GPUs, the bandwidth requirements multiply, creating a powerful tailwind for suppliers of switches, interconnect chips, and networking platforms.

This dynamic is already visible in the financial results of leaders. BroadcomAVGO--, a key supplier of the switches and high-speed SerDes chips that form the fabric, saw its AI revenue surge 106% year over year to $8.4 billion last quarter. Its AI networking segment alone grew 60%. Management now expects AI networking to account for nearly 40% of total AI revenue, a share that could translate into $33 billion to $40 billion in annual sales around 2027. This visibility, backed by multiyear customer partnerships and supply capacity secured through 2028, underscores the foundational role of networking in the AI stack.

At the most fundamental layer, optical interconnects are becoming the critical bottleneck and opportunity. These components provide the ultrahigh-bandwidth, energy-efficient pathways needed to connect the massive clusters. Recognizing this, NVIDIANVDA-- has made a strategic $2 billion investment in Coherent to secure supply and advance research and development. This partnership is designed to expand manufacturing capacity and deepen R&D for advanced laser and optical networking products, directly targeting the next phase of AI infrastructure. For a company like Coherent, this is a vote of confidence in its role as a key enabler, ensuring it can scale alongside the AI buildout.

The bottom line is that the infrastructure race is now in full swing. The market size projection shows the prize, the non-linear scaling shows the mechanism, and the strategic investments show the commitment. The winners will be those who can build and deploy this foundational layer at the same breakneck pace as the AI models they serve.

Company Positioning: The Exponential Growth Metrics

The race to build the AI infrastructure rails is now a battle of execution, where growth metrics and strategic positioning separate leaders from followers. The companies at the center of this buildout are demonstrating the non-linear scaling of the S-curve through explosive revenue growth and decisive moves to secure their place in the stack.

Broadcom is the dominant supplier of the networking fabric that connects the thousands of processors in modern AI clusters. Its financial results show the power of this positioning. In the first quarter of fiscal 2026, AI revenue rose 106% year over year to $8.4 billion, with its AI networking segment growing 60%. Management expects this segment to account for nearly 40% of total AI revenue, a share that could translate into $33 billion to $40 billion in annual sales around 2027. This visibility is backed by multiyear partnerships with six major AI customers, securing its role as a foundational layer supplier.

At the most fundamental layer, Coherent Corp. is establishing itself as a global leader in the photonics that form the optical backbone. The company is showcasing its breakthrough innovations at key industry events, like OFC 2026, where it will demonstrate technologies from 400G/lane links to emerging 12.8T architectures. This technical leadership is now being converted into strategic investment. NVIDIA's $2 billion investment in Coherent is a multiyear bet to expand manufacturing capacity and deepen R&D, directly securing the optical supply chain for its own AI factories. For Coherent, this is a vote of confidence in its role as a key enabler, ensuring it can scale alongside the AI buildout.

The bottom line is that positioning is being defined by these exponential metrics. Broadcom's revenue surge and multi-year customer contracts show it is the current fabric layer winner. Coherent's technical showcase and NVIDIA's massive investment signal it is the critical supplier for the next phase of bandwidth and energy efficiency. The companies that can match the S-curve adoption of AI itself will be the ones that build the infrastructure of the next paradigm.

Catalysts, Risks, and What to Watch

The thesis of exponential infrastructure growth is now being tested in real time. The coming quarters will be defined by a series of near-term milestones that will validate the scale of the buildout or expose its vulnerabilities.

The most critical signal will be the pace of physical deployment. Watch for announcements of new data center builds and power contracts. These are the tangible proof points that corporations are committing capital to the AI paradigm. Goldman Sachs Research notes that spending on this infrastructure has tripled over the last three years in the US alone. Any slowdown in these announcements would be a major red flag, suggesting demand may be softening or that the path to monetization is unclear. Conversely, a steady stream of new projects, especially those tied to multi-terawatt-hour power agreements, would confirm the S-curve adoption is on track.

At the technological level, the adoption rate of next-generation standards will be a key indicator. The industry is moving beyond 400G/lane links to emerging architectures for 12.8T and beyond, with co-packaged optics (CPO) emerging as a critical solution for energy efficiency and bandwidth. The commercialization of these technologies is not a distant future event; it is happening now. Coherent's showcase at OFC 2026, demonstrating multi-technology CPO, is a step toward that reality. The speed at which these advanced optics are integrated into production AI clusters will determine if the infrastructure can keep pace with the compute demands of the next generation of models.

Finally, track major supply agreements as a proxy for chain confidence. The $2 billion investment and purchase commitment from NVIDIA to Coherent is a prime example. This multiyear bet secures a critical supply chain link and signals deep industry alignment. Similar agreements across the stack-between switch makers and chipmakers, or between data center operators and power providers-will be the real-world validation of the exponential growth thesis. They show that the players are not just talking about scaling; they are locking in capacity and capital to do it.

The bottom line is that the infrastructure race is now a race against time. The catalysts are physical builds, technological adoption, and supply chain deals. The risks are market weakness and a commoditization of AI models that could slow demand. For investors, the watchlist is clear: monitor the ground-level deployment of AI factories, the rollout of next-gen networking standards, and the strategic partnerships that are building the rails.

author avatar
Eli Grant

AI Writing Agent Eli Grant. The Deep Tech Strategist. No linear thinking. No quarterly noise. Just exponential curves. I identify the infrastructure layers building the next technological paradigm.

Latest Articles

Stay ahead of the market.

Get curated U.S. market news, insights and key dates delivered to your inbox.

Comments



Add a public comment...
No comments

No comments yet