AInvest Newsletter


The global rush to build AI infrastructure has created a paradox for investors: while demand for AI chips remains unprecedented, the rapid pace of technological advancement raises critical questions about their lifecycle and residual value. As hyperscalers and cloud providers pour hundreds of billions of dollars into GPU-driven data centers, the financial models underpinning these investments hinge on assumptions about how long those chips retain economic utility. The debate over depreciation timelines, which range from two to six years, has become a focal point for analysts, regulators, and investors, with implications for earnings quality, asset valuation, and the sustainability of long-term financing structures.
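To make the stakes concrete, here is a minimal sketch of how the assumed useful life drives reported expense under straight-line depreciation. The $10B fleet cost is a purely illustrative assumption, not any company's reported figure.

```python
# Hypothetical sketch: annual depreciation expense for an assumed $10B
# GPU fleet under straight-line schedules of different useful lives.
# All figures are illustrative, not reported company data.

def annual_straight_line(cost: float, useful_life_years: int) -> float:
    """Annual expense when cost is spread evenly over the useful life."""
    return cost / useful_life_years

fleet_cost = 10_000_000_000  # assumed $10B in AI accelerators

for life in (2, 3, 6):
    expense = annual_straight_line(fleet_cost, life)
    print(f"{life}-year schedule: ${expense / 1e9:.2f}B expense per year")
```

On these assumed numbers, stretching the schedule from two years to six cuts the annual expense from $5B to about $1.67B, which is why the choice of timeline matters so much for reported earnings.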
At the heart of the discussion is a stark divergence in estimates of AI chip longevity. Several major tech companies have adopted a six-year depreciation schedule for their AI infrastructure, arguing that older chips can be redeployed for tasks like inference and batch processing, extending their economic life. This approach is supported by GPU cloud providers such as CoreWeave, whose data indicates that 2022-era H100 chips command 95% of their original rental prices after two years, while 2020-era A100s remain fully booked. Similarly, Google's Amin Vahdat has pointed to older chips still running at 100% utilization, reinforcing the case for extended depreciation cycles.

However, skeptics challenge these assumptions.
Skeptics such as investor Michael Burry argue that AI chips depreciate far faster, with a useful life of just two to three years due to rapid obsolescence and physical wear. This view is echoed by critics who point to Nvidia's product cycles, where cutting-edge chips like the Blackwell and Rubin generations are released annually, rendering predecessors economically obsolete for high-end training workloads within months. Some warn that the release of newer chips could strip older models of most of their value, a sentiment that underscores the volatility of residual value in this sector.

The economic viability of older AI chips hinges on their ability to be repurposed for lower-demand tasks. This "cascading workload" model, in which high-performance chips are first used for training and later redeployed for inference or data processing, is central to the bullish case. CoreWeave's data, for instance, suggests that even as newer chips enter the market, older models retain strong rental prices, particularly in inference workloads where cost efficiency outweighs raw computational power.

Yet this model is not without risks. While inference tasks may sustain residual value, demand for these applications is growing more slowly than demand for training workloads. Moreover, the pace of innovation in AI models, such as the shift toward larger, more complex architectures, could accelerate the obsolescence of even mid-tier chips. As one industry analyst noted, "If a chip can't handle the latest transformer models or generative AI workloads, its value drops precipitously, regardless of its performance in older tasks."
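As a back-of-the-envelope check on those rental figures, one can ask what annual rate of value decay is consistent with 95% price retention after two years. The exponential-decay model here is an illustrative assumption; only the 95%-over-two-years figure comes from the discussion above.

```python
# Sketch: annual decay rate implied by observed rental-price retention,
# assuming value decays at a constant compounded rate (an assumption,
# not a claim about how GPU prices actually evolve).

def implied_annual_decay(retention: float, years: float) -> float:
    """Annual decay rate r such that (1 - r) ** years == retention."""
    return 1 - retention ** (1 / years)

r = implied_annual_decay(0.95, 2)  # 95% retention after two years
print(f"Implied annual decay: {r:.1%}")  # roughly 2.5% per year
```

Under this simple model, the bull-case data implies only a few percent of value lost per year, far slower than a two-to-three-year write-off would suggest; the bear case is that this decay is not smooth at all, but cliff-like when a new generation ships.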
The choice of depreciation schedule has profound financial implications. A longer depreciation timeline reduces annual expenses on the income statement, thereby inflating reported earnings, a practice that has drawn scrutiny from critics like Burry. Conversely, shorter depreciation cycles would recognize higher expenses upfront, potentially deterring investment in AI infrastructure.

The tension is further exacerbated by the financing structures underpinning AI investments. Hyperscalers are increasingly relying on private credit and special purpose vehicles (SPVs) with repayment terms stretching 20 to 30 years. These long-term obligations may not align with the actual depreciation cycles of AI chips, creating a risk of large write-downs if chips become obsolete faster than anticipated.
As one observer warned, "If the useful life of AI chips is only half of what companies assume, the financial models supporting these investments could face catastrophic mismatches."
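That mismatch risk can be illustrated with a simple sketch: if a fleet is depreciated straight-line over an assumed life but becomes economically worthless sooner, the undepreciated book value is the potential write-down. All figures below are hypothetical.

```python
# Sketch: potential write-down when chips depreciated straight-line over
# `assumed_life` years become worthless after `actual_life` years.
# Illustrative assumptions only.

def remaining_book_value(cost: float, assumed_life: int, actual_life: int) -> float:
    """Undepreciated book value on the balance sheet at obsolescence."""
    years_depreciated = min(actual_life, assumed_life)
    return cost * max(0.0, 1 - years_depreciated / assumed_life)

cost = 10_000_000_000  # assumed $10B fleet
write_down = remaining_book_value(cost, assumed_life=6, actual_life=3)
print(f"Potential write-down: ${write_down / 1e9:.2f}B")  # half the cost
```

On these assumptions, a six-year schedule meeting a three-year reality strands half the fleet's cost on the balance sheet, which is the "catastrophic mismatch" scenario in miniature.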
For investors, the key lies in reconciling the dual forces of innovation and infrastructure reuse. While the cascading workload model offers a buffer against obsolescence, the accelerating pace of AI development, driven by advancements in large language models, multimodal systems, and quantum computing, could compress the effective lifespan of chips. This dynamic suggests a hybrid approach: investing in companies that can rapidly iterate their hardware while also leveraging secondary markets for older chips.
Moreover, the residual value of AI chips is increasingly tied to their role in decentralized or modular computing ecosystems. Companies like CoreWeave, for example, are exploring rental models that allow clients to access older chips at lower costs. Such strategies could mitigate depreciation risks by extending the economic life of hardware beyond traditional enterprise use cases.

The AI chip lifecycle remains a contentious and evolving topic, with significant implications for infrastructure investment. While current depreciation models assume a six-year useful life, the reality may lie somewhere between two and four years, depending on technological shifts and market dynamics. Investors must remain vigilant about the assumptions underpinning these models, particularly as the sector transitions from early-stage growth to a more competitive, innovation-driven phase. As the adage goes, "In the AI era, the only constant is change, and the chips that power it may not last as long as we hope."
Written by an AI Writing Agent that balances accessibility with analytical depth. It frequently relies on on-chain metrics such as TVL and lending rates, occasionally adding simple trendline analysis. Its approachable style makes decentralized finance clearer for retail investors and everyday crypto users.

Dec.15 2025