
Multi-Task Learning

244 papers with code · Methodology
Subtask of Transfer Learning

Multi-task learning aims to learn multiple tasks simultaneously while maximizing performance on one or all of them.

(Image credit: Cross-stitch Networks for Multi-task Learning)
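The most common instantiation of this idea is hard parameter sharing: a single shared encoder feeds several task-specific heads, and the per-task losses are simply summed. The following is a minimal sketch of that setup; the module names, dimensions, and the two toy classification tasks are illustrative assumptions, not taken from any of the papers listed below.

import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    # shared trunk reused by every task
    def __init__(self, in_dim=128, hidden=256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden), nn.ReLU())

    def forward(self, x):
        return self.net(x)

class MultiTaskModel(nn.Module):
    def __init__(self, num_classes_a=10, num_classes_b=5):
        super().__init__()
        self.encoder = SharedEncoder()
        self.head_a = nn.Linear(256, num_classes_a)   # task-specific heads
        self.head_b = nn.Linear(256, num_classes_b)

    def forward(self, x):
        h = self.encoder(x)                            # one representation, many tasks
        return self.head_a(h), self.head_b(h)

model = MultiTaskModel()
criterion = nn.CrossEntropyLoss()
x = torch.randn(32, 128)                               # toy batch of features
y_a = torch.randint(0, 10, (32,))
y_b = torch.randint(0, 5, (32,))
logits_a, logits_b = model(x)
loss = criterion(logits_a, y_a) + criterion(logits_b, y_b)   # joint objective
loss.backward()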


Greatest papers with code

Semi-Supervised Sequence Modeling with Cross-View Training

EMNLP 2018 · tensorflow/models

We therefore propose Cross-View Training (CVT), a semi-supervised learning algorithm that improves the representations of a Bi-LSTM sentence encoder using a mix of labeled and unlabeled data.

CCG SUPERTAGGING · DEPENDENCY PARSING · MACHINE TRANSLATION · MULTI-TASK LEARNING · NAMED ENTITY RECOGNITION · PART-OF-SPEECH TAGGING · UNSUPERVISED REPRESENTATION LEARNING
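A rough sketch of the cross-view training idea described in the CVT abstract, assuming a token-level tagging task: auxiliary predictors that see only a partial view of the Bi-LSTM states are trained, on unlabeled data, to match the full-view predictor, while labeled data trains the full-view predictor directly. All module names and dimensions are illustrative, not the paper's code.

import torch
import torch.nn as nn
import torch.nn.functional as F

hidden = 128
lstm = nn.LSTM(100, hidden, bidirectional=True, batch_first=True)
primary = nn.Linear(2 * hidden, 20)   # sees the full bidirectional view
aux_fwd = nn.Linear(hidden, 20)       # sees only the forward states
aux_bwd = nn.Linear(hidden, 20)       # sees only the backward states

def cvt_losses(tokens_lab, tags_lab, tokens_unlab):
    # supervised loss on labeled data, full view only
    h_lab, _ = lstm(tokens_lab)
    sup = F.cross_entropy(primary(h_lab).transpose(1, 2), tags_lab)

    # consistency loss on unlabeled data: auxiliary views match the primary view
    h_un, _ = lstm(tokens_unlab)
    with torch.no_grad():
        target = F.softmax(primary(h_un), dim=-1)      # teacher signal, not backpropagated
    fwd, bwd = h_un[..., :hidden], h_un[..., hidden:]
    cons = F.kl_div(F.log_softmax(aux_fwd(fwd), -1), target, reduction='batchmean') \
         + F.kl_div(F.log_softmax(aux_bwd(bwd), -1), target, reduction='batchmean')
    return sup + cons

# toy usage
tokens_lab = torch.randn(4, 12, 100)        # labeled batch of 12-token sentences
tags_lab = torch.randint(0, 20, (4, 12))
tokens_unlab = torch.randn(4, 12, 100)      # unlabeled batch
cvt_losses(tokens_lab, tags_lab, tokens_unlab).backward()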

DRAGNN: A Transition-based Framework for Dynamically Connected Neural Networks

13 Mar 2017 · tensorflow/models

In this work, we present a compact, modular framework for constructing novel recurrent neural architectures.

DEPENDENCY PARSING · MULTI-TASK LEARNING

One Model To Learn Them All

16 Jun 2017 · tensorflow/tensor2tensor

We present a single model that yields good results on a number of problems spanning multiple domains.

IMAGE CAPTIONING · IMAGE CLASSIFICATION · MULTI-TASK LEARNING

Language Models are Few-Shot Learners

28 May 2020 · openai/gpt-3

By contrast, humans can generally perform a new language task from only a few examples or from simple instructions - something which current NLP systems still largely struggle to do.

COMMON SENSE REASONING · COREFERENCE RESOLUTION · DOMAIN ADAPTATION · FEW-SHOT LEARNING · LANGUAGE MODELLING · MULTI-TASK LEARNING · NATURAL LANGUAGE INFERENCE · QUESTION ANSWERING · SENTENCE COMPLETION · UNSUPERVISED MACHINE TRANSLATION · WORD SENSE DISAMBIGUATION
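A tiny illustration of the few-shot, in-context setup this entry refers to: the "training" examples are simply placed in the prompt, and the language model is asked to complete the pattern. The task, demonstrations, and prompt format below are made-up placeholders; no model or API call is shown.

# Hypothetical demonstrations for an English-to-French task,
# formatted as a single few-shot prompt (in-context learning).
demos = [
    ("sea otter", "loutre de mer"),
    ("cheese", "fromage"),
    ("plush giraffe", "girafe en peluche"),
]
query = "peppermint"

prompt = "Translate English to French.\n\n"
for en, fr in demos:
    prompt += f"{en} => {fr}\n"
prompt += f"{query} =>"

print(prompt)  # this string would be sent to the language model as-is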

Learning General Purpose Distributed Sentence Representations via Large Scale Multi-task Learning

ICLR 2018 · facebookresearch/InferSent

In this work, we present a simple, effective multi-task learning framework for sentence representations that combines the inductive biases of diverse training objectives in a single model.

MULTI-TASK LEARNING · NATURAL LANGUAGE INFERENCE · PARAPHRASE IDENTIFICATION · SEMANTIC TEXTUAL SIMILARITY
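A hedged sketch of the general training pattern such multi-task sentence-representation work follows: one shared sentence encoder, several task-specific output modules, and batches drawn from the different tasks in turn. The module names, the classification-style heads, and the random task sampling are illustrative assumptions, not the paper's implementation (which trains sequence-to-sequence and classification objectives at scale).

import random
import torch
import torch.nn as nn

encoder = nn.GRU(300, 512, batch_first=True)       # shared sentence encoder
heads = {
    "nli": nn.Linear(512, 3),                      # e.g. entailment / neutral / contradiction
    "paraphrase": nn.Linear(512, 2),
}
params = list(encoder.parameters()) + [p for h in heads.values() for p in h.parameters()]
opt = torch.optim.Adam(params, lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def encode(x):
    # use the final GRU state as the sentence representation
    _, h = encoder(x)
    return h.squeeze(0)

def training_step(task, x, y):
    # each step updates the shared encoder through one task's head
    opt.zero_grad()
    loss = loss_fn(heads[task](encode(x)), y)
    loss.backward()
    opt.step()
    return loss.item()

# toy loop: sample a task, then a batch from that task's data
for _ in range(3):
    task = random.choice(list(heads))
    x = torch.randn(16, 20, 300)                   # fake batch of 20-token sentences
    y = torch.randint(0, heads[task].out_features, (16,))
    training_step(task, x, y)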

The Microsoft Toolkit of Multi-Task Deep Neural Networks for Natural Language Understanding

ACL 2020 · namisan/mt-dnn

We present MT-DNN, an open-source natural language understanding (NLU) toolkit that makes it easy for researchers and developers to train customized deep learning models.

MULTI-TASK LEARNING · NATURAL LANGUAGE UNDERSTANDING · STRUCTURED PREDICTION

Improving Multi-Task Deep Neural Networks via Knowledge Distillation for Natural Language Understanding

20 Apr 2019 · namisan/mt-dnn

This paper explores the use of knowledge distillation to improve a Multi-Task Deep Neural Network (MT-DNN) (Liu et al., 2019) for learning text representations across multiple natural language understanding tasks.

MULTI-TASK LEARNING · NATURAL LANGUAGE UNDERSTANDING
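A generic sketch of the knowledge-distillation objective this entry builds on: the student is trained on a mix of the hard labels and the teacher's softened output distribution. In the paper the teacher is an ensemble of MT-DNNs per task; here a single teacher's logits stand in, and the temperature and weighting are illustrative defaults rather than values from the paper.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # soft targets: teacher distribution softened by temperature T
    soft_targets = F.softmax(teacher_logits / T, dim=-1)
    soft_loss = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                         soft_targets, reduction='batchmean') * (T * T)
    # hard loss: standard cross-entropy against the gold labels
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss

# toy usage
student_logits = torch.randn(8, 3, requires_grad=True)
teacher_logits = torch.randn(8, 3)
labels = torch.randint(0, 3, (8,))
distillation_loss(student_logits, teacher_logits, labels).backward()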

Towards Real-Time Multi-Object Tracking

ECCV 2020 · Zhongdao/Towards-Realtime-MOT

In this paper, we propose an MOT system that allows target detection and appearance embedding to be learned in a shared model.

Ranked #2 on Multi-Object Tracking on MOT16 (using extra training data)

MULTI-OBJECT TRACKING · MULTIPLE OBJECT TRACKING · MULTI-TASK LEARNING
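A minimal sketch of learning detection and appearance embedding in one shared model, as this entry describes: a single backbone, a detection head and an embedding head, and a combined loss. The backbone, head shapes, placeholder losses, and fixed loss weighting below are assumptions for illustration only and do not reflect the paper's architecture or its loss-balancing scheme.

import torch
import torch.nn as nn
import torch.nn.functional as F

class JointDetEmbed(nn.Module):
    # one shared backbone produces features used by both heads
    def __init__(self, emb_dim=128, num_ids=1000):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU())
        self.det_head = nn.Conv2d(128, 5, 1)                 # box regression + objectness per cell
        self.emb_head = nn.Conv2d(128, emb_dim, 1)           # appearance embedding per cell
        self.id_classifier = nn.Linear(emb_dim, num_ids)     # trains embeddings from identity labels

    def forward(self, images):
        feat = self.backbone(images)
        return self.det_head(feat), self.emb_head(feat)

model = JointDetEmbed()
images = torch.randn(2, 3, 128, 128)
det_out, emb_out = model(images)

# toy joint objective: placeholder detection loss plus identity-classification loss on embeddings
det_loss = F.mse_loss(det_out, torch.randn_like(det_out))
emb_vec = emb_out.mean(dim=(2, 3))                           # pool embeddings for the example
id_loss = F.cross_entropy(model.id_classifier(emb_vec), torch.randint(0, 1000, (2,)))
(det_loss + id_loss).backward()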