 Sakana’s new M2N2 AI algorithm evolves hybrid AI models without retraining by flexibly mixing parameters beyond fixed layers, preserving diverse niches via competition, and pairing complements via attraction scoring

September 4, 2025 //  by Finnovate

A new evolutionary technique called Model Merging of Natural Niches (M2N2), from the AI lab Sakana AI, overcomes the limitations of other model-merging methods and can even evolve new models entirely from scratch. The algorithm has three key features that allow it to explore a wider range of possibilities and discover more effective model combinations.

First, M2N2 eliminates fixed merging boundaries, such as blocks or layers. Instead of grouping parameters by pre-defined layers, it uses flexible “split points” and “mixing ratios” to divide and combine models. This means that, for example, the algorithm might merge 30% of the parameters in one layer from Model A with 70% of the parameters from the same layer in Model B. The process starts with an “archive” of seed models. At each step, M2N2 selects two models from the archive, determines a mixing ratio and a split point, and merges them. If the resulting model performs well, it is added back to the archive, replacing a weaker one. This allows the algorithm to explore increasingly complex combinations over time.

Second, M2N2 manages the diversity of its model population through competition: it simulates competition for limited resources. This nature-inspired approach naturally rewards models with unique skills, as they can “tap into uncontested resources” and solve problems others can’t. These niche specialists are the most valuable for merging.

Third, M2N2 uses a heuristic called “attraction” to pair models for merging. Rather than simply combining the top-performing models, as other merging algorithms do, it pairs them based on their complementary strengths.
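To make the recipe concrete, here is a minimal, self-contained sketch of that merge-and-evolve loop: a flexible split point plus mixing ratio for crossover, a resource-sharing fitness that rewards niche specialists, and attraction-based parent pairing. It uses toy flat parameter vectors and a synthetic per-task scorer; all names and the specific formulas for shared fitness and attraction are illustrative assumptions, not Sakana’s implementation.

```python
# Hypothetical sketch of an M2N2-style merge-and-evolve loop (not Sakana's code).
import numpy as np

rng = np.random.default_rng(0)

N_PARAMS = 1_000      # pretend each "model" is a flat parameter vector
N_TASKS = 8           # the "resources" the population competes for
ARCHIVE_SIZE = 16
STEPS = 500

# Each task "prefers" a different random direction in parameter space.
_task_dirs = rng.normal(size=(N_PARAMS, N_TASKS)) / np.sqrt(N_PARAMS)


def evaluate(params: np.ndarray) -> np.ndarray:
    """Toy per-task scores in (0, 1); stands in for real benchmark evals."""
    return 1.0 / (1.0 + np.exp(-params @ _task_dirs))       # shape (N_TASKS,)


def merge(a: np.ndarray, b: np.ndarray, split: int, ratio: float) -> np.ndarray:
    """Flexible crossover: the split point can fall anywhere in the flat
    vector (i.e. mid-layer), and the mixing ratio blends the two parents
    on one side of the split."""
    child = a.copy()
    child[split:] = ratio * a[split:] + (1.0 - ratio) * b[split:]
    return child


def shared_fitness(scores: np.ndarray) -> np.ndarray:
    """Competition for limited resources: each task's reward is split among
    the models that solve it, so a model solving tasks nobody else can
    keeps that reward for itself (the niche specialist advantage)."""
    per_task_total = scores.sum(axis=0) + 1e-9               # (N_TASKS,)
    return (scores / per_task_total).sum(axis=1)             # (pop_size,)


def attraction(scores_a: np.ndarray, scores: np.ndarray) -> np.ndarray:
    """Pair parents by complementary strengths: candidate b is attractive
    to a where b is strong on the tasks a is weak on."""
    return np.maximum(scores - scores_a, 0.0).sum(axis=1)


# --- seed archive and evolve -------------------------------------------------
archive = rng.normal(size=(ARCHIVE_SIZE, N_PARAMS)) * 0.1
scores = np.stack([evaluate(m) for m in archive])            # (ARCHIVE_SIZE, N_TASKS)

for step in range(STEPS):
    fit = shared_fitness(scores)

    # Parent A: sampled in proportion to resource-sharing fitness.
    ia = rng.choice(ARCHIVE_SIZE, p=fit / fit.sum())

    # Parent B: sampled by attraction to A (complementary skills), not raw score.
    attr = attraction(scores[ia], scores)
    attr[ia] = 0.0
    ib = rng.choice(ARCHIVE_SIZE, p=(attr + 1e-9) / (attr + 1e-9).sum())

    # Merge with a random split point and mixing ratio, then evaluate the child.
    split = int(rng.integers(1, N_PARAMS))
    ratio = float(rng.uniform())
    child = merge(archive[ia], archive[ib], split, ratio)
    child_scores = evaluate(child)

    # Add the child back to the archive if it beats the weakest member.
    trial = np.vstack([scores, child_scores])
    trial_fit = shared_fitness(trial)
    worst = int(np.argmin(trial_fit[:ARCHIVE_SIZE]))
    if trial_fit[-1] > trial_fit[worst]:
        archive[worst] = child
        scores[worst] = child_scores

print("best mean task score in archive:", scores.mean(axis=1).max())
```

The departure from rank-by-fitness merging is visible in the two parent-selection steps: the first parent is drawn under resource-sharing fitness, which protects niche specialists from being crowded out, and the second is drawn by attraction rather than by its overall score.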


