Funding | 6/13/2025

Multiverse Computing Raises $215M to Revolutionize AI Model Compression

Spanish startup Multiverse Computing has raised $215 million in Series B funding to advance its AI model compression technology, CompactifAI. The technology aims to reduce AI model sizes by up to 95% while maintaining performance, potentially transforming AI accessibility and economics.

Multiverse Computing's Funding Boost

Spanish quantum software startup Multiverse Computing has closed a $215 million Series B round. The investment is intended to scale the company's AI compression technology, CompactifAI, which the company says can shrink large language models (LLMs) by up to 95% without compromising performance.

Technology and Investment Details

The funding round was led by Bullhound Capital, with participation from HP Tech Ventures, SETT, Forgepoint Capital International, CDP Venture Capital, Santander Climate VC, Quantonation, and Toshiba. The capital, a mix of equity, grants, and partnerships, is intended to accelerate global adoption of Multiverse's compressed AI models, potentially impacting the $106 billion AI inference market.

CompactifAI: A Quantum-Inspired Solution

Multiverse's CompactifAI technology applies quantum-inspired techniques to compress AI models more effectively than traditional methods such as pruning and quantization, which often degrade performance. Using tensor networks, it decomposes a model's weight layers into networks of smaller interconnected tensors, truncating the least significant correlations and discarding redundant parameters with minimal loss in precision. This allows existing open-source LLMs to be made smaller, faster, and more efficient.
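
To make the general idea concrete, the sketch below shows a simple low-rank factorization of a single weight matrix, the basic move that tensor-network compression builds on. It is not Multiverse's CompactifAI algorithm; the matrix sizes, rank, and noise level are illustrative assumptions.

```python
# Minimal sketch of low-rank layer compression, the basic building block behind
# tensor-network approaches. This is NOT Multiverse's proprietary CompactifAI
# method; sizes, ranks, and the noise level are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy "layer": a 1024 x 1024 weight matrix with mostly low-rank structure,
# standing in for the redundancy found in large trained models.
m, n, true_rank = 1024, 1024, 64
W = rng.normal(size=(m, true_rank)) @ rng.normal(size=(true_rank, n))
W += 0.01 * rng.normal(size=(m, n))   # small full-rank perturbation

# Truncated SVD: keep only the r strongest singular directions.
r = 64
U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :r] * s[:r]   # (m x r) factor
B = Vt[:r, :]          # (r x n) factor

# The layer y = W @ x becomes y ~= A @ (B @ x): two smaller matmuls.
x = rng.normal(size=n)
rel_err = np.linalg.norm(W @ x - A @ (B @ x)) / np.linalg.norm(W @ x)

orig_params = m * n             # 1,048,576 parameters
small_params = r * (m + n)      # 131,072 parameters
print(f"parameters: {orig_params:,} -> {small_params:,} "
      f"({100 * (1 - small_params / orig_params):.1f}% smaller)")
print(f"relative output error: {rel_err:.4f}")
```

CompactifAI is described as going further, restructuring entire layers as tensor networks rather than applying a single matrix factorization, but the parameter-count arithmetic above captures the intuition behind the size reductions the company claims.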

Implications for AI Deployment

The reduction in model size could significantly lower the computational and energy costs of running large-scale LLMs and enable deployment on a wider range of hardware, including edge devices. Running models locally rather than in the cloud also keeps data on-device, improving privacy, and the company reports that CompactifAI models are 4 to 12 times faster and cut inference costs by 50% to 80%.
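
To put those reported savings in concrete terms, the short sketch below applies the 50% and 80% figures to a hypothetical monthly inference bill; the workload and price are assumptions for illustration only.

```python
# Back-of-the-envelope view of the reported 50-80% inference cost reduction.
# The monthly token volume and per-million-token price are made-up assumptions,
# not figures from Multiverse or any provider.
monthly_tokens = 2_000_000_000          # assumed workload: 2B tokens/month
usd_per_million_tokens = 1.50           # assumed baseline inference price

baseline = monthly_tokens / 1_000_000 * usd_per_million_tokens
for reduction in (0.50, 0.80):
    compressed = baseline * (1 - reduction)
    print(f"{reduction:.0%} cheaper: ${baseline:,.0f} -> ${compressed:,.0f} per month")
```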

Future Prospects and Expansion

With over 100 clients and more than 160 patents, Multiverse Computing plans to use the new funds to expand product development and global deployment of its compressed models. The company aims to commercialize these models, making advanced AI more accessible and sustainable across industries such as finance, healthcare, manufacturing, and defense.

Conclusion

Multiverse Computing's $215 million funding round marks a notable moment for the industry, with CompactifAI offering a promising answer to the challenges of AI model size, cost, and energy consumption. The strong investor backing highlights the technology's potential to become integral to the evolving AI infrastructure, potentially unlocking new innovations and benefits for businesses and consumers worldwide.