Ethereum News Today: ETH Zurich and EPFL launch carbon-neutral open-source LLM with support for over 1,500 languages
ETH Zurich and EPFL are set to release a fully open-source large language model (LLM), trained on Switzerland’s carbon-neutral “Alps” supercomputer and expected to be available under the Apache 2.0 license later this year [1]. Unlike most proprietary AI systems, this model will allow unrestricted access to its weights, training code, and data methodology, enabling fine-tuning, auditing, and deployment by researchers and startups alike. This marks a departure from the “black-box” approach of major commercial AI providers and aligns with the ethos of Web3 and permissionless innovation [1].
The model will be offered in two configurations—8 billion and 70 billion parameters—trained on 15 trillion tokens spanning more than 1,500 languages, with a 60/40 split between English and non-English data. This multilingual coverage positions it as a strong alternative to English-centric models like GPT-4, especially for non-commercial and academic research [1]. Training ran on 10,000 Nvidia Grace Hopper chips, with the infrastructure powered entirely by renewable energy, emphasizing sustainability and sovereignty [1].
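The figures above imply a concrete token budget per language. As a rough sanity check (assuming a naive uniform split across the non-English languages, which real corpora will certainly not follow):

```python
# Figures from the article: 15 trillion training tokens,
# split 60/40 between English and non-English text.
TOTAL_TOKENS = 15e12

english = TOTAL_TOKENS * 0.60      # ~9 trillion English tokens
non_english = TOTAL_TOKENS * 0.40  # ~6 trillion non-English tokens

# Naive uniform average across ~1,500 non-English languages
# (illustrative only; actual per-language shares are heavily skewed).
per_language = non_english / 1_500

print(f"English: {english:.1e} tokens")
print(f"Non-English: {non_english:.1e} tokens")
print(f"Uniform average: ~{per_language:.1e} tokens per language")
```

Even under this idealized split, each non-English language would average only a few billion tokens, which is why per-language quality in massively multilingual models tends to vary widely.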
A key differentiator is the model’s open-by-design architecture, which contrasts with the API-only access typical of closed models. It enables developers to run local inference and implement on-chain AI integrations, such as real-time legal document summarization in smart contracts or fraud detection in DeFi systems [1]. The model also complies with the EU AI Act, which, as of August 2, 2025, requires foundation models to publish training data summaries, audit logs, and adversarial testing results—requirements the Swiss model already meets [1].
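Because the weights themselves will be published, local inference needs no API key or remote endpoint. A minimal sketch of the legal-summarization use case mentioned above, assuming the weights ship in a Hugging Face-compatible format (the model id "swiss-ai/model-70b" is a placeholder; the real repository name has not been announced):

```python
# Sketch: running the open weights locally, once released.
# Assumptions: Hugging Face-compatible checkpoint; the `transformers`
# and `torch` packages installed; model id below is hypothetical.

def build_prompt(document: str) -> str:
    """Wrap a legal document in a simple summarization instruction."""
    return (
        "Summarize the following legal document in three sentences:\n\n"
        + document
    )

def summarize_local(document: str,
                    model_id: str = "swiss-ai/model-70b") -> str:
    """Run the prompt through locally hosted weights.

    Apache 2.0 licensing means no vendor API is involved: the
    checkpoint is downloaded once and inference runs on local hardware.
    """
    from transformers import pipeline  # lazy import: heavy dependency

    generator = pipeline("text-generation", model=model_id)
    out = generator(build_prompt(document), max_new_tokens=200)
    return out[0]["generated_text"]
```

An on-chain integration would typically run this off-chain (e.g. in an oracle service) and post only the summary or a fraud-score back to the smart contract, since inference itself is far too heavy for on-chain execution.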
Compared to other open models like Alibaba’s Qwen series, Switzerland’s LLM focuses on full transparency and multilingual depth rather than model diversity or deployment speed. While Qwen provides Mixture-of-Experts (MoE) architectures with up to 235 billion parameters, the Swiss model opts for two clean, research-oriented sizes [1]. In multilingual performance, the Swiss model outpaces Qwen, supporting over 1,500 languages compared to Qwen’s 119.
However, open-source LLMs face challenges, including performance gaps, implementation instability, integration complexity, and resource intensity. Issues like hallucinations, documentation deficiencies, and legal ambiguities also persist. Despite these hurdles, the model’s transparency and compliance advantages make it an appealing option for researchers and startups aiming to bypass vendor lock-in and API restrictions [1].
The AI market is projected to surpass $500 billion by the end of the decade, with over 80% currently controlled by closed providers. The blockchain-AI segment, meanwhile, is expected to grow from $550 million in 2024 to $4.33 billion by 2034, reflecting the rising demand for decentralized and auditable AI solutions [1]. As open-source models evolve, they may redefine how AI is developed, deployed, and regulated—especially in environments where trust, sustainability, and regulatory compliance are paramount [1].
Source:
[1] “This open-source LLM could redefine AI research, and it’s 100% public,” Cointelegraph. https://cointelegraph.com/explained/this-open-source-llm-could-redefine-ai-research-and-its-100-public
