Search Results for author: Beyza Ermiş

Found 1 paper, 1 paper with code

Pushing Mixture of Experts to the Limit: Extremely Parameter Efficient MoE for Instruction Tuning

1 code implementation • 11 Sep 2023 • Ted Zadouri, Ahmet Üstün, Arash Ahmadian, Beyza Ermiş, Acyr Locatelli, Sara Hooker

The Mixture of Experts (MoE) is a widely known neural architecture where an ensemble of specialized sub-models optimizes overall performance with a constant computational cost.
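To make the architecture concrete, below is a minimal sketch of an MoE layer with top-1 routing in PyTorch, where each token is processed by a single expert so per-token compute stays constant as more experts are added. The class name `MoELayer` and all hyperparameters are illustrative; this is a generic dense-expert example, not the parameter-efficient variant proposed in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MoELayer(nn.Module):
    """Illustrative top-1 routed mixture-of-experts layer (not the paper's method)."""

    def __init__(self, d_model: int, d_hidden: int, num_experts: int):
        super().__init__()
        # Router scores each token against every expert.
        self.router = nn.Linear(d_model, num_experts)
        # Each expert is a small feed-forward sub-model.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.ReLU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model)
        gates = F.softmax(self.router(x), dim=-1)   # (batch, seq, num_experts)
        top_gate, top_idx = gates.max(dim=-1)       # top-1 routing per token
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = top_idx == e                     # tokens routed to expert e
            if mask.any():
                # Scale each routed token's output by its gate value.
                out[mask] = top_gate[mask].unsqueeze(-1) * expert(x[mask])
        return out


# Usage example
layer = MoELayer(d_model=16, d_hidden=32, num_experts=4)
tokens = torch.randn(2, 5, 16)
print(layer(tokens).shape)  # torch.Size([2, 5, 16])
```

Because only the selected expert runs per token, adding experts grows the parameter count without increasing per-token computation, which is the constant-cost property the abstract refers to.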
