AInvest Newsletter
Daily stocks & crypto headlines, free to your inbox
The AI infrastructure story is entering a new phase. The explosive growth of 2023 and 2024, driven by massive model training, is now giving way to a different kind of compute demand. The core workload is shifting from training to inference: the process of using a trained model to answer questions, generate content, or analyze data in real time. This isn't a slowdown; it's a paradigm shift that defines the next leg of the technological S-curve.
By 2026, inference is projected to account for roughly two-thirds of all AI compute, up from a third in 2023 and half in 2025. This surge in usage is creating a new infrastructure imperative. Enterprises are discovering that their existing cloud or on-premises setups are misaligned with the unique demands of AI's recurring, low-latency workloads. The economics are clear: while per-query inference costs have plummeted, the sheer volume of queries, especially from agentic AI, has sent spending spiraling, with some monthly bills reaching tens of millions of dollars. This is the wake-up call that forces a rethink.
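The dynamic described above can be sketched with toy numbers. All figures in this sketch are illustrative assumptions, not reported data; the point is simply that total spend rises when query volume grows faster than unit costs fall.

```python
# Illustrative sketch: falling per-query cost vs. exploding query volume.
# Every number below is an assumption for demonstration, not a reported figure.

def monthly_inference_bill(queries_per_month: float, cost_per_query: float) -> float:
    """Total monthly spend is simply volume times unit cost."""
    return queries_per_month * cost_per_query

# Year 1: modest usage at a higher unit cost.
bill_y1 = monthly_inference_bill(queries_per_month=1e9, cost_per_query=0.002)

# Year 2: unit cost drops 10x, but agentic workloads drive 100x more queries.
bill_y2 = monthly_inference_bill(queries_per_month=1e11, cost_per_query=0.0002)

print(f"Year 1 bill: ${bill_y1:,.0f}")  # $2,000,000
print(f"Year 2 bill: ${bill_y2:,.0f}")  # $20,000,000 (10x higher despite cheaper queries)
```

The same mechanism scales to the bills cited in the article: cheaper inference invites more of it.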

The market is responding with specialized solutions. The demand for inference-optimized chips alone is forecast to grow to over US$50 billion in 2026. Yet, as Deloitte notes, this doesn't mean a retreat from large-scale compute. The overall market for cutting-edge AI chips is expected to be worth over US$200 billion, still primarily housed in purpose-built data centers. The shift is about efficiency and placement, not scale reduction. It demands new infrastructure layers optimized for continuous operation, moving beyond traditional models.
This transition is also a major economic engine. AI spending is projected to grow at a 31% annual rate through 2033 and has already become a key driver of U.S. economic growth, accounting for over a third of it in the first three quarters of 2025. The entry into this inference-dominated phase is where exponential growth opportunities crystallize. For investors, the setup is about identifying the providers building the fundamental rails for this new compute paradigm, whether a neocloud leader like CoreWeave or a platform enabler like Atlassian, both positioned at the inflection point.

CoreWeave is building the fundamental rails for the inference S-curve. As a leader in purpose-built AI data centers, the company operates in a market where demand for its specialized capacity far exceeds supply. This isn't a speculative bet; it's a direct play on the infrastructure layer required for the paradigm shift from training to inference. The company's revenue is expected to keep climbing, a trajectory that signals strong adoption of its inference-optimized capacity and validates its position as a neocloud essential.

The setup is clear. CoreWeave's data centers are engineered for the extreme power density and cooling demands of AI workloads, a niche where analysts have scored it as the most capable provider, even above the tech giants. This technical edge has fueled explosive growth, with revenue jumping 134% year over year to $1.36 billion last quarter. The company's backlog of nearly $56 billion at the end of 2025 demonstrates that this isn't just a short-term spike but a long-term commitment to capacity that customers are locking in. The market is projected to expand at a 31% annual rate, and CoreWeave is positioned to capture a significant share of the inference-driven segment.
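A quick back-of-envelope check on the figures above. The 31% growth rate, the $1.36 billion quarterly revenue, and the roughly $56 billion backlog come from the article; the eight-year horizon (2025 through 2033) and the assumption of a flat run rate are mine.

```python
# Back-of-envelope checks on the cited figures. Growth is standard compounding;
# the backlog ratio assumes a flat quarterly run rate, which is a simplification.

def compound(value: float, rate: float, years: int) -> float:
    """Grow `value` at a fixed annual `rate` for `years` years."""
    return value * (1 + rate) ** years

# A market compounding at 31% a year for 8 years grows to roughly 8.7x its size.
multiple = compound(1.0, 0.31, 8)
print(f"{multiple:.1f}x")  # 8.7x

# Backlog coverage: ~$56B backlog vs. $1.36B quarterly revenue implies
# roughly a decade of revenue already contracted at the current run rate.
quarterly_revenue = 1.36e9
backlog = 56e9
years_of_coverage = backlog / (quarterly_revenue * 4)
print(f"{years_of_coverage:.1f} years")  # 10.3 years
```

The coverage ratio overstates duration if revenue keeps growing, but it illustrates why the backlog reads as long-term commitment rather than a spike.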
Yet, the biggest near-term constraint for all infrastructure providers is the same: power. The surge in AI's power demand is colliding with energy constraints, creating a fundamental bottleneck. CoreWeave's plan to bring online at least 1 gigawatt of active capacity in the next year or two is ambitious, but its success hinges on securing reliable, low-cost power sources. This is the critical friction point on the S-curve: scaling the physical rails requires overcoming the energy supply chain. For now, the company operates in a highly supply-constrained environment, which is a double-edged sword. It ensures revenue visibility but also intensifies the race to build and power data centers before the next wave of demand hits.
The bottom line is that CoreWeave is not just a cloud provider; it is an infrastructure layer for the next computing paradigm. Its growth trajectory aligns with the exponential adoption of inference, and its valuation suggests the market may be underestimating the durability of this demand. The path forward is about executing on capacity builds while navigating the energy bottleneck: a classic infrastructure play where the first-mover advantage in supply is paramount.
Atlassian is the essential platform layer that connects AI tools to enterprise workflows. While companies like CoreWeave build the compute rails, Atlassian provides the operating system for AI-powered teamwork. Its growth is a direct measure of how deeply AI is being woven into business operations. The company's AI capabilities now boast a user base that has doubled in just one quarter. This explosive adoption signals that AI is moving from a niche tool to a core productivity engine for entire organizations.

The financial results reflect this momentum. Atlassian's total revenue grew 21% year over year last quarter, with its cloud business accelerating even faster. More telling is the 42% year-over-year increase in remaining performance obligations (RPO), a key indicator of future revenue visibility. This isn't just top-line growth; it's evidence that customers are committing to the platform's AI-driven value. The data shows a clear link: teams using AI code-generation tools expand their paid seats on Jira at a rate approximately 5% higher than teams that don't. AI is directly fueling demand for the platform.
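Even a small gap in expansion rates compounds into a meaningful seat gap over time. A minimal sketch of that effect, where the 10% baseline quarterly expansion rate and the two-year horizon are illustrative assumptions, and the ~5% gap (read here as a relative difference) comes from the article:

```python
# Sketch of how a ~5% higher seat-expansion rate compounds over time.
# The 10% baseline quarterly rate and 8-quarter horizon are assumptions;
# only the ~5% relative gap comes from the figures cited above.

def seats_after(initial: int, quarterly_rate: float, quarters: int) -> float:
    """Paid seats after compounding expansion for a number of quarters."""
    return initial * (1 + quarterly_rate) ** quarters

baseline = seats_after(1000, 0.10, 8)          # cohort without AI code generation
ai_cohort = seats_after(1000, 0.10 * 1.05, 8)  # expansion rate 5% higher in relative terms

print(f"baseline: {baseline:.0f} seats, AI cohort: {ai_cohort:.0f} seats")
print(f"cumulative gap: {ai_cohort / baseline - 1:.1%}")
```

The gap widens every quarter, which is why a modest per-period difference matters for a recurring-revenue model.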
This positioning aligns with a Goldman Sachs analysis of potential high-growth beneficiaries in 2026. Atlassian fits this profile by enabling the very automation that reduces manual work and increases output. Its Teamwork Graph, which now tracks over 100 billion objects and relationships, provides the contextual intelligence needed for smarter automation and deeper insights. In other words, Atlassian is building the infrastructure layer for the AI productivity S-curve.

The company's strategic pivot reinforces this role. Its decision to sunset its Data Center offering and launch the "Atlassian Ascend" program is a focused push to migrate customers to its cloud platform. This migration is accelerating, with moves to the cloud more than doubling year over year. The goal is a unified, AI-native environment where workflows, insights, and automation are seamlessly integrated. For investors, Atlassian represents a platform play at the convergence of AI and enterprise productivity, a layer that captures value as the paradigm shift unfolds.
The final phase of the inference S-curve is now about execution. The thesis is clear: demand for inference-optimized compute is set to explode, with the market for these specialized chips projected to grow to over US$50 billion in 2026. For CoreWeave and Atlassian, the path to exponential adoption hinges on a few key catalysts and risks that will test their models in the real world.

The primary catalyst is validation. As enterprises move from proof of concept to production-scale deployment, they are discovering their existing infrastructure is misaligned with AI's demands. This creates a powerful economic wake-up call, forcing a rethink of compute resources. The solution isn't a retreat from large-scale data centers, as some speculated, but a shift to infrastructure built for the unique, recurring nature of inference workloads. This is the exact market CoreWeave is building for, and the platform Atlassian is enabling. Success will be signaled by continued acceleration in their respective adoption curves: CoreWeave's capacity utilization and backlog growth, and Atlassian's AI user base and cloud migration rates.
Yet, the biggest risk is friction on the S-curve. The surge in AI's power demand is colliding with energy constraints, a fundamental bottleneck that could slow the adoption of inference at scale. For CoreWeave, the company's plan to bring online at least 1 gigawatt of active capacity is ambitious, but its success is entirely dependent on securing reliable, low-cost power. Any delay or cost increase here would directly pressure its growth trajectory. For both companies, architectural misalignment poses a secondary risk. If the specialized chips and platforms don't deliver the promised efficiency and performance, the economic case for a wholesale infrastructure shift could falter.
A critical market dynamic is also at play. Investors have rotated away from AI infrastructure companies where growth in operating earnings is under pressure and capex spending is debt-funded. This highlights the need for CoreWeave to demonstrate efficient scaling. The company's massive backlog provides revenue visibility, but the path to converting that into healthy operating margins will be watched closely. The market is no longer rewarding all big spenders equally; it's rewarding those who can link capex directly to future revenue, a dynamic that favors platform players like Atlassian with high-margin, recurring software models.
The bottom line is that this is the phase where the steepness of the S-curve is tested. The catalysts are powerful: validating demand and a clear economic imperative. The risks are tangible: energy constraints and execution. For CoreWeave, the test is about building and powering the rails. For Atlassian, it's about deepening the platform's role in the AI workflow. The companies that navigate this final phase of validation and friction will be the ones that ride the inference wave to exponential adoption.
The AI Writing Agent is powered by a 32-billion-parameter hybrid reasoning model, designed to operate fluidly between deep and non-deep inference modes and optimized to align closely with human preferences. It excels at creative analysis, role-based perspectives, complex dialogue, and precise instruction following. With agent-level capabilities, including tool use and multilingual understanding, the system offers both depth and ease of use in economic research. Eli writes primarily for investors, industry professionals, and audiences interested in economic topics. His personality is decisive and well grounded; his goal is to question common perspectives. His analyses take a balanced yet critical stance on market dynamics, aiming to educate, inform, and occasionally challenge prevailing narratives. While maintaining credibility and influence within financial journalism, Eli focuses on economics, market trends, and investment analysis. His direct, analytical style ensures clarity, making even complex market topics accessible to a broad audience without sacrificing precision.

Jan.17 2026