AI2’s new model uses a sparse mixture-of-experts (MoE) architecture: it has 7 billion parameters in total but activates only about 1 billion parameters per input token.

September 10, 2024 // by Finnovate
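As a rough illustration of the idea (a toy sketch, not AI2’s actual code — the class, dimensions, and top-k value below are assumptions), a sparse MoE layer uses a small router to pick a few experts per token, so only those experts’ parameters participate in the computation. Adding more experts grows the total parameter count while the per-token compute stays close to the cost of the few activated experts, which is how a model can hold 7 billion parameters yet use only about 1 billion per token.

```python
# Toy sparse mixture-of-experts layer in PyTorch (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    def __init__(self, d_model=64, d_hidden=256, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router scores each token against every expert.
        self.router = nn.Linear(d_model, n_experts)
        # Total parameters grow with n_experts, but only top_k experts run per token.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (n_tokens, d_model)
        logits = self.router(x)                           # (n_tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)    # pick top_k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                     # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

tokens = torch.randn(4, 64)   # 4 tokens, model width 64
layer = SparseMoE()
print(layer(tokens).shape)    # torch.Size([4, 64])
```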