no code implementations • 16 Nov 2022 • Juan Zha, Zheng Li, Ying Wei, Yu Zhang
However, most prior works assume that all tasks are sampled from a single data source, an assumption that does not hold in real-world scenarios where tasks are heterogeneous and drawn from different distributions.
1 code implementation • 25 May 2022 • Qinyuan Ye, Juan Zha, Xiang Ren
Recent works suggest that transformer models can multi-task across diverse NLP tasks and adapt to new tasks efficiently.