Nvidia Researchers Predict Small Language Models as Future of AI, Warn of Sector Slowdown
By Ainvest
Saturday, September 13, 2025, 5:02 pm ET · 2 min read
Nvidia researchers have highlighted the potential of Small Language Models (SLMs) as the future of artificial intelligence (AI), emphasizing their cost-effectiveness and efficiency for specific tasks compared to Large Language Models (LLMs). SLMs, with up to 40 billion parameters, excel at narrow tasks and consume fewer resources, making them an attractive alternative to LLMs, which are often more expensive and not necessarily the best fit for every task. Nvidia is urging companies to consider working with smaller models to optimize their AI solutions.

According to a recent report by ResearchAndMarkets.com, the "Small Language Models (SLMs) - Company Evaluation Report, 2025," SLMs are compact AI models designed to handle natural language processing (NLP) tasks efficiently and accurately. Unlike LLMs with many billions of parameters, SLMs operate with fewer than 2 billion parameters, significantly reducing memory requirements and energy consumption. This makes them well suited to resource-constrained environments such as edge devices and mobile applications [1].
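To put the memory claim in perspective, the short Python sketch below estimates weight storage as parameter count times bytes per parameter. It is an illustrative approximation, not a figure from the report, and it ignores the extra memory real inference needs for activations and key-value caches.

```python
# Rough weight-storage estimate for a small language model.
# Assumption for illustration: memory ~= parameter count x bytes per parameter;
# real deployments also need room for activations and caches.

def weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    """Approximate memory required just to hold the model weights, in GB."""
    return num_params * bytes_per_param / 1e9

# A 2-billion-parameter model at common precisions.
for precision, bytes_per_param in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    gb = weight_memory_gb(2e9, bytes_per_param)
    print(f"2B parameters, {precision}: ~{gb:.1f} GB")  # ~4.0, 2.0, 1.0 GB
```

At half precision a 2-billion-parameter model works out to roughly 4 GB of weights, which is why quantized models of this size can realistically fit on a single consumer GPU or many mobile devices.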
Key players in the SLM market, including Microsoft, IBM, and Infosys, are investing heavily in research and development, forming strategic partnerships, and driving innovation to maintain a competitive edge. Microsoft, for instance, integrates its SLMs with the Azure AI platform, while IBM focuses on enterprise AI solutions with a strong emphasis on security and compliance. Infosys, on the other hand, is expanding its SLM offerings to cater to domain-specific applications, emphasizing privacy-first AI adoption trends [1].
The market for SLMs is expected to grow, driven by regulatory compliance, affordable AI solutions, advancements in model compression, and industry-specific AI models. However, challenges such as shallow contextual understanding, lack of multimodal processing, and fragmented development tools may hinder standardization and growth. Despite these challenges, opportunities such as self-optimizing AI models, automated AI model optimization, and specialized AI infrastructure are poised to drive the market forward [1].
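Model compression, one of the growth drivers cited above, usually means quantization, pruning, or distillation. The sketch below shows one common approach, post-training dynamic quantization in PyTorch, applied to a toy stack of linear layers rather than to any vendor's actual SLM; it is a hedged illustration, not a production recipe.

```python
# Minimal sketch of post-training dynamic quantization with PyTorch.
import io
import torch
import torch.nn as nn

def state_dict_bytes(model: nn.Module) -> int:
    """Serialize the state dict to memory and return its size in bytes."""
    buffer = io.BytesIO()
    torch.save(model.state_dict(), buffer)
    return buffer.getbuffer().nbytes

# Toy stand-in for a small model: a couple of dense layers.
model = nn.Sequential(
    nn.Linear(1024, 4096), nn.ReLU(),
    nn.Linear(4096, 1024),
)

# Convert the Linear layers' weights to int8 ahead of time;
# activations are quantized on the fly at inference.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

print(f"fp32 checkpoint: {state_dict_bytes(model) / 1e6:.1f} MB")
print(f"int8 checkpoint: {state_dict_bytes(quantized) / 1e6:.1f} MB")
```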
Citi Research has also flagged the impact of Broadcom's year-over-year XPU growth on Nvidia's 2026 GPU sales, which it estimates could decline by roughly 4% as a result. Even so, Citi's revised 2025/2026 estimates for Nvidia remain above consensus on increased spending from emerging cloud and sovereign AI customers. The broker's new price target of US$200 is based on its revised 2026 EPS estimate multiplied by the same 30x P/E ratio, and it maintains a Buy rating [2].
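For readers who want to check the valuation math, the relationship described is simply price target = 2026 EPS estimate x P/E multiple; the snippet below backs out the EPS figure implied by the US$200 target and the 30x multiple, since the source does not state it directly.

```python
# Back-of-the-envelope check of the valuation described above.
# Assumption: price target = EPS estimate x P/E multiple.
pe_multiple = 30           # unchanged 30x P/E applied by the broker
price_target_usd = 200.0   # revised price target, US$

implied_2026_eps = price_target_usd / pe_multiple
print(f"Implied 2026 EPS estimate: ${implied_2026_eps:.2f}")  # ~$6.67
```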
In conclusion, SLM adoption is gaining traction thanks to the models' efficiency and cost-effectiveness, and companies are urged to consider SLMs for specific tasks when optimizing their AI solutions. Regulatory compliance, affordable AI solutions, and advances in model compression should keep the market growing, even though shallow contextual understanding and the lack of multimodal processing remain hurdles; self-optimizing AI models and specialized AI infrastructure offer further opportunities to drive the market forward.

Editorial Disclosure and AI Transparency: Ainvest News uses advanced Large Language Model (LLM) technology to synthesize and analyze market data in real time. To ensure the highest standards of integrity, every article undergoes a rigorous human-in-the-loop verification process.
While AI assists with data processing and initial drafting, a professional member of Ainvest's editorial staff independently reviews, verifies, and approves all content to ensure accuracy and compliance with the editorial standards of Ainvest Fintech Inc. This human oversight is designed to mitigate AI hallucinations and ensure financial context.
Investment Disclaimer: This content is provided for informational purposes only and does not constitute professional investment, legal, or financial advice. Markets carry inherent risks. Users are advised to conduct independent research or consult a certified financial advisor before making any decision. Ainvest Fintech Inc. disclaims all liability for actions taken on the basis of this information.
