Multiverse Launches Smallest High-Performing AI Models with Quantum-Inspired Compression

Generated by AI AgentCoin World
Thursday, Aug 14, 2025, 11:41 am ET · 2 min read
Aime Summary

- Multiverse Computing launches world’s smallest high-performing AI models using quantum-inspired compression.

- Models like SuperFly and ChickBrain enable on-device processing with enhanced privacy and lower costs.

- Partnerships with Apple and Samsung, plus API access hosted on AWS, democratize access to compact AI technology.

- The innovation shifts AI from cloud dependency to localized processing in consumer devices.

Multiverse Computing, a European artificial intelligence startup based in Donostia, Spain, has announced the launch of the world’s smallest high-performing AI models, challenging the conventional trend of increasingly large and complex AI systems. By leveraging a proprietary quantum-inspired compression algorithm called CompactifAI, the company has managed to significantly reduce the size of existing AI models without compromising their performance [1]. This breakthrough has the potential to shift the landscape of AI deployment by enabling on-device processing, offering benefits such as offline functionality, enhanced privacy, reduced latency, and lower operational costs [1].

Among the newly introduced models is SuperFly, a compressed version of Hugging Face’s SmolLM2-135M. Reduced from the original 135 million parameters to 94 million, SuperFly is designed for highly restricted environments and is well suited to embedding in home appliances; it could, for instance, let a washing machine respond instantly to voice commands like “start quick wash” [1]. Another standout is ChickBrain, a 3.2-billion-parameter model derived from Meta’s Llama 3.1 8B. ChickBrain is not only compact enough to run offline on a MacBook but also scores slightly higher than its parent model on several benchmarks, including language comprehension (MMLU-Pro), mathematical reasoning (Math 500 and GSM8K), and general knowledge (GPQA Diamond) [1].
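The article frames SuperFly as a model small enough to live inside an appliance and answer voice commands locally. As a rough sketch of what on-device inference with a model of this class looks like, the snippet below runs the publicly released parent checkpoint, SmolLM2-135M-Instruct, through the Hugging Face transformers library; the article does not give a public identifier for SuperFly itself, so no compressed checkpoint is loaded, and the prompt simply mirrors the appliance example above.

```python
# Illustrative sketch of on-device inference with the published parent model,
# SmolLM2-135M-Instruct, via Hugging Face transformers. The compressed
# SuperFly model is not publicly identified in the article, so it is not
# loaded here; only the prompt mirrors the washing-machine example.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HuggingFaceTB/SmolLM2-135M-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # small enough for CPU

messages = [{"role": "user", "content": "start quick wash"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output_ids = model.generate(input_ids, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```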

Multiverse’s compression approach, as explained by co-founder Román Orús, is fundamentally different from traditional methods, drawing inspiration from quantum physics to achieve a more refined and efficient reduction in model size [1]. The company, having raised €189 million in its most recent funding round, is well-positioned to scale its technology and bring it to a wide range of industries and applications [1].
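The article does not disclose CompactifAI’s internals beyond calling it quantum-inspired. Purely as a conceptual sketch, the snippet below compresses a toy weight matrix with a truncated SVD, a simple low-rank stand-in for the tensor-network decompositions that quantum-inspired methods are generally built on; the matrix sizes, hidden rank, and energy threshold are illustrative assumptions, not Multiverse’s algorithm.

```python
# Conceptual illustration only: truncated-SVD low-rank factorization as a
# crude stand-in for tensor-network-style compression. This is NOT
# Multiverse's CompactifAI; sizes and thresholds are assumptions.
import numpy as np

def compress_layer(weight: np.ndarray, energy_kept: float = 0.99):
    """Factor `weight` (out_dim x in_dim) into two thin matrices, keeping
    enough singular values to retain `energy_kept` of the squared energy."""
    u, s, vt = np.linalg.svd(weight, full_matrices=False)
    cumulative = np.cumsum(s**2) / np.sum(s**2)
    rank = int(np.searchsorted(cumulative, energy_kept)) + 1
    a = u[:, :rank] * s[:rank]   # shape: (out_dim, rank)
    b = vt[:rank, :]             # shape: (rank, in_dim)
    return a, b, rank

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy weight matrix with hidden low-rank structure plus noise,
    # mimicking the redundancy that compression methods exploit.
    w = rng.normal(size=(1024, 64)) @ rng.normal(size=(64, 1024))
    w += 0.01 * rng.normal(size=(1024, 1024))

    a, b, r = compress_layer(w, energy_kept=0.99)
    original, compressed = w.size, a.size + b.size
    rel_err = np.linalg.norm(w - a @ b) / np.linalg.norm(w)
    print(f"rank kept: {r}")
    print(f"parameters: {original} -> {compressed} "
          f"({compressed / original:.1%} of original)")
    print(f"relative reconstruction error: {rel_err:.3f}")
```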

The strategic value of Multiverse’s innovation is evident in its partnerships with major manufacturers, including Apple and Samsung, as well as its collaborations with industrial companies such as BASF and Bosch. These relationships underscore the broad applicability of its technology across various sectors [1]. Moreover, to democratize access to its models, Multiverse is hosting its compressed AI models on AWS via an API, allowing developers to integrate them into their applications at competitive per-token fees [1].
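The article notes that the hosted models are billed per token but does not document the interface. As a purely hypothetical sketch, the snippet below assumes an OpenAI-compatible chat-completions endpoint, a common convention for hosted LLM services; the URL, model name, and environment variable are placeholders, not Multiverse’s actual API.

```python
# Hypothetical sketch: the article does not document the hosted API, so this
# assumes an OpenAI-compatible chat-completions endpoint, a common convention
# for hosted LLMs. The URL, model name, and env var below are placeholders.
import os
import requests

API_URL = "https://example-host.example.com/v1/chat/completions"  # placeholder
API_KEY = os.environ.get("COMPRESSED_MODEL_API_KEY", "")           # placeholder

def ask(prompt: str) -> str:
    """Send a single user prompt and return the model's reply text."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": "superfly",  # placeholder model identifier
            "messages": [{"role": "user", "content": prompt}],
            "max_tokens": 64,     # short replies keep per-token costs low
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask("Start a quick wash cycle."))
```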

By pushing the boundaries of AI model efficiency, Multiverse Computing is redefining what is possible in the realm of device-based intelligence. The company’s focus on compact, high-performing models is not only enabling smarter home appliances and consumer electronics but also aligning with growing demands for data privacy and operational efficiency in an increasingly connected world [1].

The development of on-device AI represents a shift from cloud dependency to localized processing, which could significantly alter how AI is integrated into everyday devices. As the company continues to refine its technology and expand its partnerships, the implications for the AI industry and consumer technology are profound, signaling a future where powerful AI is no longer confined to the cloud but resides directly in the devices we use every day [1].

Source: [1] Revolutionary AI Models: Multiverse Unveils World’s Smallest High-Performing AI (https://coinmarketcap.com/community/articles/689e00d1bde83c23c11407eb/)
