Search Results for author: Itay Yona

Found 1 paper, 0 papers with code

Buffer Overflow in Mixture of Experts

no code implementations • 8 Feb 2024 • Jamie Hayes, Ilia Shumailov, Itay Yona

Mixture of Experts (MoE) has become a key ingredient for scaling large foundation models while keeping inference costs steady.
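The cost claim comes from sparse routing: each token is processed by only k of the n experts, so compute per token stays roughly constant as the total parameter count grows. Below is a minimal, illustrative sketch of a generic top-k MoE layer in PyTorch; it is not the paper's implementation, and all names (TopKMoE, n_experts, k) are assumptions chosen for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Minimal top-k Mixture of Experts layer (illustrative sketch only)."""

    def __init__(self, d_model: int, n_experts: int = 8, k: int = 2):
        super().__init__()
        self.k = k
        # Gating network: scores each token against every expert.
        self.router = nn.Linear(d_model, n_experts)
        # Each expert is a small feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, 4 * d_model),
                nn.ReLU(),
                nn.Linear(4 * d_model, d_model),
            )
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (n_tokens, d_model). Only k experts run per token,
        # which is what keeps inference cost steady as n_experts grows.
        logits = self.router(x)                            # (n_tokens, n_experts)
        weights, idx = torch.topk(logits, self.k, dim=-1)  # top-k experts per token
        weights = F.softmax(weights, dim=-1)               # normalise over the chosen k
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e                   # tokens whose slot-th pick is e
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out

# Usage: route a batch of 16 token embeddings through the layer.
layer = TopKMoE(d_model=64)
y = layer(torch.randn(16, 64))  # (16, 64)
```

Real deployments additionally impose per-expert capacity buffers when batching tokens across requests; the paper's title refers to an issue arising in that buffered routing, not to this simplified per-token loop.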
