Search Results for author: Sarubi Thillainathan

Found 3 papers, 1 paper with code

Unlocking Parameter-Efficient Fine-Tuning for Low-Resource Language Translation

no code implementations • 5 Apr 2024 • Tong Su, Xin Peng, Sarubi Thillainathan, David Guzmán, Surangika Ranathunga, En-Shiun Annie Lee

Parameter-efficient fine-tuning (PEFT) methods are increasingly vital in adapting large-scale pre-trained language models for diverse tasks, offering a balance between adaptability and computational efficiency.

Tasks: Computational Efficiency · Machine Translation · +2
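To make the idea behind PEFT concrete, here is a minimal, hedged sketch of one well-known PEFT technique, a LoRA-style low-rank update (an illustration of the general approach, not necessarily the specific methods evaluated in the paper; all names and dimensions below are illustrative assumptions):

```python
import numpy as np

# Sketch of a LoRA-style parameter-efficient update (illustrative assumption,
# not the paper's exact method). The pretrained weight W stays frozen; only
# the low-rank factors A (r x d_in) and B (d_out x r) are trained, cutting
# trainable parameters from d_out*d_in down to r*(d_in + d_out).
rng = np.random.default_rng(0)
d_in, d_out, r = 512, 512, 8            # r << d_in, d_out

W = rng.standard_normal((d_out, d_in))  # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01
B = np.zeros((d_out, r))                # zero init: no change at step 0

def forward(x):
    # Effective weight is W + B @ A, but the full delta is never materialized.
    return W @ x + B @ (A @ x)

full = d_out * d_in
lora = r * (d_in + d_out)
print(f"trainable params: {lora} vs full fine-tuning: {full} "
      f"({100 * lora / full:.1f}%)")
```

With these (hypothetical) dimensions the trainable parameter count falls to about 3% of full fine-tuning, which is the adaptability/efficiency trade-off the abstract refers to.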

Leveraging Auxiliary Domain Parallel Data in Intermediate Task Fine-tuning for Low-resource Translation

1 code implementation • 2 Jun 2023 • Shravan Nayak, Surangika Ranathunga, Sarubi Thillainathan, Rikki Hung, Anthony Rinaldi, Yining Wang, Jonah Mackey, Andrew Ho, En-Shiun Annie Lee

In this paper, we show that intermediate-task fine-tuning (ITFT) of pre-trained multilingual sequence-to-sequence (PMSS) models is extremely beneficial for domain-specific NMT, especially when target-domain data is limited or unavailable and the considered languages are missing or under-represented in the PMSS model.

Tasks: NMT
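The two-stage schedule that ITFT describes can be sketched as follows (a hedged illustration of the training order only; the `train_stage` helper, dataset names, and sizes are hypothetical placeholders, not the authors' code):

```python
# Sketch of intermediate-task fine-tuning (ITFT): the pretrained multilingual
# seq2seq checkpoint is first fine-tuned on auxiliary domain parallel data,
# then on the (possibly tiny) target-domain data.

def train_stage(model_state, dataset, stage):
    # Stand-in for a fine-tuning loop: record which data touched the model
    # and in what order, instead of running real gradient updates.
    model_state = dict(model_state)
    model_state.setdefault("stages", []).append((stage, len(dataset)))
    return model_state

pretrained = {"name": "PMSS-checkpoint"}  # hypothetical pretrained model
auxiliary = ["aux sentence pair"] * 1000  # auxiliary domain parallel data
target = ["target sentence pair"] * 50    # limited target-domain data

# Stage 1: intermediate task on the auxiliary domain data.
model = train_stage(pretrained, auxiliary, "intermediate")
# Stage 2: final fine-tuning on the target domain.
model = train_stage(model, target, "target")

print([stage for stage, _ in model["stages"]])
```

The point of the schedule is that the large auxiliary corpus adapts the model toward the domain before the scarce target data is used, which is why ITFT helps most when target-domain data is limited or unavailable.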
