
DigiBanker

Bringing you cutting-edge new technologies and disruptive financial innovations.


Multiverse’s Model Zoo offers compact, high-performing AI for device commands and local reasoning, bringing powerful intelligence to home appliances, smartphones, and PCs via quantum compression

August 18, 2025 //  by Finnovate

AI startup Multiverse Computing has released two AI models that it bills as the world’s smallest high-performing models, capable of handling chat, speech, and, in one case, reasoning. The tiny models are intended to be embedded in Internet of Things devices and to run locally on smartphones, tablets, and PCs. “We can compress the model so much that they can fit on devices,” said founder Román Orús. “You can run them on premises, directly on your iPhone, or on your Apple Watch.”

The two new models are small enough to bring chat AI capabilities to just about any IoT device, and they work without an internet connection. Multiverse playfully calls the family the Model Zoo because the products are named for animal brain sizes. SuperFly is a compressed version of Hugging Face’s open-source SmolLM2-135M, a model developed for on-device use. Where the original has 135 million parameters, SuperFly has 94 million, which Orús likens to the size of a fly’s brain. “This is like having a fly, but a little bit more clever,” he said.

SuperFly is designed to be trained on very restricted data, such as a device’s own operations. Multiverse envisions it embedded in home appliances, letting users operate them with voice commands like “start quick wash” on a washing machine, or ask troubleshooting questions. With a little processing power (an Arduino, for example), the model can handle a voice interface.

The other model, ChickBrain, is larger at 3.2 billion parameters but far more capable, with reasoning abilities. It is a compressed version of Meta’s Llama 3.1 8B model, Multiverse says, yet it is small enough to run on a MacBook with no internet connection required.
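To see why these parameter counts translate into on-device footprints, a back-of-the-envelope sketch helps. The arithmetic below is our own illustration, not Multiverse's published figures: it assumes weights are stored at common precisions (16-bit floats or 4-bit quantized values), since the company has not disclosed the storage format its quantum-compression pipeline produces.

```python
def weight_footprint_mb(n_params: int, bytes_per_param: float) -> float:
    """Approximate memory needed just for the model weights, in megabytes."""
    return n_params * bytes_per_param / 1e6

# Parameter counts from the article; precisions are illustrative assumptions.
MODELS = {
    "SmolLM2-135M (original)":   135_000_000,
    "SuperFly (compressed)":      94_000_000,
    "Llama 3.1 8B (original)": 8_000_000_000,
    "ChickBrain (compressed)": 3_200_000_000,
}

for name, params in MODELS.items():
    fp16 = weight_footprint_mb(params, 2)    # 16-bit floats: 2 bytes/param
    int4 = weight_footprint_mb(params, 0.5)  # 4-bit quantization: 0.5 bytes/param
    print(f"{name:26s} fp16 ~ {fp16:7.0f} MB   int4 ~ {int4:6.0f} MB")
```

Under these assumptions SuperFly's 94 million parameters occupy roughly 188 MB at fp16 (and under 50 MB at 4-bit), comfortably within a phone's or smartwatch's memory, while ChickBrain's 3.2 billion parameters land in the low gigabytes, which is why a MacBook suffices.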
More importantly, Orús said that ChickBrain actually slightly outperforms the original in several standard benchmarks, including the language-skill benchmark MMLU-Pro, math skills benchmarks Math 500 and GSM8K, and the general knowledge benchmark GPQA Diamond.


Category: Essential Guidance


Copyright © 2025 Finnovate Research · All Rights Reserved · Privacy Policy