GPT-5 Disappoints, Shifts AI Focus to Practical Applications

Generated by AI Agent Ticker Buzz
Sunday, Aug 17, 2025, 4:03 am ET · 4 min read

Aime Summary

- OpenAI's GPT-5 underperformed expectations, failing to deliver AGI breakthroughs and exposing technical flaws like basic errors and personality shifts.

- Market focus shifts to practical AI commercialization as scaling laws hit data/compute limits, with competitors narrowing OpenAI's lead in cutting-edge development.

- Despite tech skepticism, AI venture capital remains robust (33% of global VC), driven by ChatGPT's roughly 12 billion dollars in annual recurring revenue and by productization over AGI hype.

- Industry adapts to "AI winter" risks through multimodal models and forward-deployed engineering, prioritizing cost-effective implementation over theoretical superintelligence.

OpenAI's highly anticipated GPT-5 model failed to deliver the revolutionary breakthroughs that many had hoped for. While the path to achieving Artificial General Intelligence (AGI) appears to have hit a roadblock, the market focus is shifting towards leveraging existing technologies to create broader commercial value in products and services.

Last week, the release of OpenAI's new model, GPT-5, was expected to be a major milestone for the company. The CEO had previously announced that GPT-5 would be a significant step towards AGI. However, the model's launch was met with widespread disappointment. Users shared instances of the new model making basic errors, such as incorrectly labeling a map of the United States, while experienced users expressed dissatisfaction with its performance and perceived changes in its "personality," noting that it performed poorly in benchmark tests.

This outcome, however unwelcome for OpenAI, points to a shift in the nature of the AI competition. Even if GPT-5 does not represent an extraordinary advance towards AGI or so-called superintelligence, it could still drive innovation in products and services built on AI models.

This situation has raised a critical question in Silicon Valley: after hundreds of billions of dollars of investment, has the progress of generative AI technology hit a plateau at its current stage? The question not only challenges OpenAI's reported valuation of roughly 500 billion dollars but also prompts a re-evaluation of the trajectory of AI development.

Despite skepticism about the pace of technological advancement, enthusiasm in capital markets and for industrial applications has not waned. Investors seem to prioritize AI's actual growth in commercial applications over the distant promise of AGI. This shift suggests that the second half of the AI competition will focus on more practical and cost-effective productization and deployment.

Over the past three years, AI researchers, users, and investors have become accustomed to a rapid pace of technological advancement. The release of GPT-5 disrupted that trend. The model's performance was described as "clumsy" because of technical issues, leading to widespread user complaints and even unfavorable comparisons with its predecessor. The CEO acknowledged the "bumpy" launch, attributing it to a malfunction in the underlying "automatic switch" that decides which model handles each query, which caused the system to fall back on a weaker model.
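For readers unfamiliar with the mechanism, an "automatic switch" of this kind is essentially a router that estimates how demanding a query is and picks a backend model accordingly. The sketch below is a hypothetical, simplified illustration of that idea only; the model names, difficulty heuristic, and threshold are invented for this example and are not OpenAI's actual implementation.

```python
# Hypothetical sketch of an "automatic switch" (model router), for illustration only.
# The model names, scoring heuristic, and threshold are invented examples,
# not OpenAI's real system.
from dataclasses import dataclass


@dataclass
class Route:
    model: str   # which backend model will handle the request
    reason: str  # why the router picked it


def estimate_difficulty(prompt: str) -> float:
    """Crude difficulty heuristic: longer prompts and 'reasoning' keywords score higher."""
    keywords = ("prove", "step by step", "debug", "analyze", "plan")
    score = min(len(prompt) / 2000, 1.0)
    score += 0.2 * sum(word in prompt.lower() for word in keywords)
    return min(score, 1.0)


def route(prompt: str, threshold: float = 0.5) -> Route:
    """Send hard queries to a slower, stronger model; easy ones to a cheaper, faster model."""
    difficulty = estimate_difficulty(prompt)
    if difficulty >= threshold:
        return Route("strong-reasoning-model", f"difficulty {difficulty:.2f} >= {threshold}")
    return Route("fast-lightweight-model", f"difficulty {difficulty:.2f} < {threshold}")


if __name__ == "__main__":
    print(route("What's the capital of France?"))
    print(route("Prove step by step that the algorithm terminates, then debug the edge cases."))
```

In a setup like this, a misfiring difficulty estimator or a broken switch sends every query to the weaker, cheaper model, which matches the failure mode the CEO described for launch day.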

The disappointment is particularly stark because, before GPT-5's release, the industry was filled with optimistic predictions about the imminent arrival of AGI. The CEO had even predicted it would arrive during the Trump administration. A renowned AI critic, a professor of psychology and neural science at New York University, expressed skepticism, saying that GPT-5 symbolized the "scaling to AGI" approach and that the approach had not succeeded.

Meanwhile, the competitive landscape of the industry has subtly changed. Competitors such as Anthropic, DeepSeek, and Musk's xAI have narrowed the gap with OpenAI in cutting-edge development, and OpenAI's dominance is no longer as pronounced.

The underperformance of GPT-5 can be attributed to the core logic supporting the development of large language models—the "scaling laws"—reaching its limits. Over the past five years, companies like OpenAI and Anthropic have followed a simple formula: investing more data and computational power to create larger and better models.
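As a point of reference, one widely cited empirical form of these scaling laws, the "Chinchilla" formulation from Hoffmann et al. (2022), which the article does not cite, expresses a model's training loss as a function of its parameter count and the amount of training data:

```latex
% Chinchilla-style scaling law (Hoffmann et al., 2022), shown for illustration.
% L: training loss, N: parameter count, D: training tokens,
% E: irreducible loss, A, B, alpha, beta: fitted constants.
L(N, D) \approx E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}
```

Loss falls predictably as N and D grow, which is why labs kept buying compute and scraping data; the constraints described below bite precisely because both terms are becoming expensive to shrink further.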

However, this approach faces two major constraints. First, there is data exhaustion, as AI companies have nearly depleted all available free training data on the internet. While they are seeking new data sources through deals with publishers and copyright holders, it remains uncertain whether this will be sufficient to drive technological progress.

Second, there are physical and economic limits on computational power. Training and running large AI models require enormous amounts of energy. The training of GPT-5 is estimated to have used hundreds of thousands of next-generation processors. The CEO also acknowledged that while the underlying AI models are still progressing rapidly, chatbots like ChatGPT may not improve much further.

Signs of slowing technological progress have led some seasoned researchers to draw parallels with historical "AI winters." A professor of computer science at the University of California, Berkeley, warned that the current situation resembles the bubble burst of the 1980s, when technological innovations failed to deliver on their promises and could not provide investment returns. "The bubble burst, the system didn't make money, and we couldn't find enough high-value applications," the professor stated.

The professor cautioned that overly high expectations can easily lead to a collapse in investor confidence. If investors believe the bubble has been overinflated, "they will quickly exit through the nearest door, and the collapse could be extremely, extremely, extremely fast."

However, capital continues to flow into AI startups and infrastructure projects. This year, AI has accounted for 33% of the total global venture capital investment.

The nature of the competition is changing: rather than a technological stalemate, what is happening is a shift in focus. A researcher at Princeton University noted that AI companies are gradually accepting that they are building infrastructure for products. The researcher's team found that GPT-5's performance across a range of tasks was not significantly worse than that of leading alternatives, while it excelled in cost-effectiveness and speed. This could open the door to innovation in products and services built on AI models, even if the model does not represent an extraordinary advance towards AGI. Meta's chief scientist likewise believes that large language models trained purely on text are entering a phase of diminishing returns, but that models trained on multimodal data such as video, aimed at understanding the physical world, still hold immense potential.

The trend is also reflected in corporate strategies. Companies like OpenAI have begun sending "forward-deployed engineers" to client companies to help integrate their models. The researcher commented that if companies believed they were on the verge of automating all human work, they would not be taking such actions.

Despite ongoing debates among experts about the technological outlook, investors in Silicon Valley appear unfazed. AI-related stocks and startup valuations continue to soar: the market capitalization of one major technology company has reached 4.4 trillion dollars, near its all-time high, and the stock price of one of OpenAI's investors has surged more than 50% in the past month.

What drives this investment enthusiasm is no longer the grand narrative of AGI but the robust growth of products like ChatGPT, which is reported to have generated roughly 12 billion dollars in annual recurring revenue for OpenAI. One investor said the company's product has followed Google's path to ubiquity, "becoming a verb."

Many investors believe there is still immense untapped value in the current generation of models. A partner at a venture capital firm said that in commercial and consumer applications, startups and enterprises are only beginning to scratch the surface of these models' potential. As the chief scientist of Hugging Face noted, even if AGI or superintelligence cannot be achieved in the short term, "there are still many cool things that can be created." For the market, that may be the most important takeaway at this stage.
