NVIDIA Announces Spectrum-XGS Ethernet for Giga-Scale AI Superfactories
By Ainvest
Friday, August 22, 2025, 12:08 pm ET · 1 min read
NVDA
NVIDIA has announced Spectrum-XGS Ethernet, a scale-across technology for combining multiple data centers into "unified, giga-scale AI super-factories." The platform provides 1.6x greater bandwidth density than off-the-shelf Ethernet for multi-tenant, hyperscale AI factories. It offers auto-adjusted distance congestion control, latency management, and end-to-end telemetry, nearly doubling the performance of NVIDIA's Collective Communications Library. The new offering builds on NVIDIA's partnership with AI hyperscaler CoreWeave, which will be among the first to connect its data centers with Spectrum-XGS Ethernet.
NVIDIA has announced Spectrum-XGS Ethernet, a technology designed to combine multiple distributed data centers into unified, giga-scale AI super-factories. The offering aims to address the capacity limits of individual data centers by providing scalable, efficient infrastructure for AI workloads.

Spectrum-XGS Ethernet is integrated into NVIDIA's Spectrum-X platform and delivers 1.6x greater bandwidth density than off-the-shelf Ethernet. It uses algorithms that dynamically adapt the network to the distance between data centers, providing automatic congestion control, latency management, and end-to-end telemetry. According to NVIDIA, the technology nearly doubles the performance of the NVIDIA Collective Communications Library (NCCL), improving communication efficiency across geographically dispersed AI clusters.
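For readers who want a concrete sense of what the Collective Communications Library does, the sketch below shows a minimal NCCL all-reduce in C: the kind of multi-GPU collective whose cross-site performance Spectrum-XGS is said to nearly double. This is an illustrative, single-node example only; it does not involve Spectrum-XGS itself, omits error checking, and assumes CUDA and NCCL are installed (built with something like nvcc allreduce.c -lnccl).

/* Minimal single-process NCCL all-reduce across all visible GPUs.
   Illustrative sketch only: buffers are left uninitialized because the
   point is the collective call, not the numerical result. */
#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>
#include <nccl.h>

int main(void) {
  int ndev = 0;
  cudaGetDeviceCount(&ndev);

  ncclComm_t  *comms   = malloc(ndev * sizeof(ncclComm_t));
  cudaStream_t *streams = malloc(ndev * sizeof(cudaStream_t));
  float **sendbuf = malloc(ndev * sizeof(float *));
  float **recvbuf = malloc(ndev * sizeof(float *));
  const size_t count = 1 << 24;   /* 16M floats, 64 MB per GPU */

  /* Allocate a send and receive buffer plus a stream on each device. */
  for (int i = 0; i < ndev; ++i) {
    cudaSetDevice(i);
    cudaStreamCreate(&streams[i]);
    cudaMalloc((void **)&sendbuf[i], count * sizeof(float));
    cudaMalloc((void **)&recvbuf[i], count * sizeof(float));
  }

  /* One communicator per local GPU. A multi-node or multi-site job would
     instead use ncclCommInitRank with an out-of-band bootstrap (e.g. MPI). */
  ncclCommInitAll(comms, ndev, NULL);

  /* Group the per-GPU calls so NCCL issues them as a single collective. */
  ncclGroupStart();
  for (int i = 0; i < ndev; ++i)
    ncclAllReduce(sendbuf[i], recvbuf[i], count, ncclFloat, ncclSum,
                  comms[i], streams[i]);
  ncclGroupEnd();

  for (int i = 0; i < ndev; ++i) {
    cudaSetDevice(i);
    cudaStreamSynchronize(streams[i]);
  }
  printf("all-reduce of %zu floats across %d GPUs completed\n", count, ndev);

  for (int i = 0; i < ndev; ++i) {
    cudaFree(sendbuf[i]);
    cudaFree(recvbuf[i]);
    ncclCommDestroy(comms[i]);
  }
  free(sendbuf); free(recvbuf); free(comms); free(streams);
  return 0;
}

In a geographically distributed deployment, the same all-reduce would span communicators created with ncclCommInitRank across sites, and the inter-data-center hops are where the distance-aware congestion control described above is aimed.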
The technology is already being adopted by AI hyperscaler CoreWeave, which will be among the first to connect its data centers with Spectrum-XGS Ethernet. CoreWeave's commitment to the new infrastructure reinforces NVIDIA's position in the AI infrastructure market and could attract further partnerships with other hyperscale operators.
NVIDIA's Spectrum-XGS Ethernet is available now as part of the NVIDIA Spectrum-X Ethernet platform. The technology is designed to support the growing demand for AI capabilities while reducing energy consumption and operational costs across the industry.
