1 code implementation • 16 Apr 2024 • Songtao Jiang, Tuo Zheng, Yan Zhang, Yeying Jin, Zuozhu Liu
Mixture-of-Experts Tuning (MoE-Tuning) has effectively enhanced the performance of general MLLMs with fewer parameters, yet its application in resource-limited medical settings remains largely unexplored.