Multi-task learning aims to learn multiple tasks simultaneously while maximizing performance on one or all of them.
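The common "hard parameter sharing" realization of this idea can be sketched in a few lines: one shared encoder feeds several task-specific heads, and a joint objective sums the per-task losses. All sizes and weights below are illustrative toy values, not taken from any of the papers listed here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy batch: 4 examples with 8 input features (all sizes are illustrative).
x = rng.normal(size=(4, 8))

# One shared encoder feeds two task-specific heads (hard parameter sharing).
W_shared = rng.normal(size=(8, 16))
W_task_a = rng.normal(size=(16, 3))   # task A: 3-way classification head
W_task_b = rng.normal(size=(16, 1))   # task B: scalar regression head

h = np.tanh(x @ W_shared)             # representation shared by both tasks
logits_a = h @ W_task_a               # per-example class scores, shape (4, 3)
pred_b = h @ W_task_b                 # per-example regression output, shape (4, 1)

# A joint objective typically sums (optionally weighted) per-task losses,
# so gradients from every task update the shared encoder.
```

Because both heads read the same representation `h`, training on either task shapes features that the other task can reuse.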
We therefore propose Cross-View Training (CVT), a semi-supervised learning algorithm that improves the representations of a Bi-LSTM sentence encoder using a mix of labeled and unlabeled data.
SOTA for CCG Supertagging on CCGBank
In this work, we present a simple, effective multi-task learning framework for sentence representations that combines the inductive biases of diverse training objectives in a single model.
This paper explores the use of knowledge distillation to improve a Multi-Task Deep Neural Network (MT-DNN) (Liu et al., 2019) for learning text representations across multiple natural language understanding tasks.
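The core of knowledge distillation is training a student to match a teacher's softened output distribution. The helper below is a generic sketch of that objective (temperature-scaled softmax plus cross-entropy against the teacher's soft targets), not the exact MT-DNN distillation recipe; `distillation_loss` and its parameters are names chosen for illustration.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; higher T produces softer distributions."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Cross-entropy between teacher soft targets and student predictions."""
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T))
    return -(p_teacher * log_p_student).sum(axis=-1).mean()
```

The loss is minimized when the student's distribution matches the teacher's; in practice it is usually mixed with the ordinary hard-label loss.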
The model is trained hierarchically, introducing an inductive bias by supervising a set of low-level tasks at the bottom layers and more complex tasks at the top layers.
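Structurally, hierarchical supervision means attaching a prediction head at an intermediate layer for the simple task and a head at the top layer for the complex one, so the complex task builds on features already shaped by the simpler supervision. The sketch below is a minimal toy version of that wiring, with made-up layer sizes and label counts.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(2, 8))                    # toy batch of 2 inputs

# Bottom layer: a low-level task (e.g. tagging) is supervised here.
W1 = rng.normal(size=(8, 16))
h1 = np.tanh(x @ W1)
low_level_logits = h1 @ rng.normal(size=(16, 5))    # 5 low-level labels

# Top layer: a more complex task is supervised here, reusing h1.
W2 = rng.normal(size=(16, 16))
h2 = np.tanh(h1 @ W2)
high_level_logits = h2 @ rng.normal(size=(16, 10))  # 10 high-level labels

# Training sums both task losses; gradients from the high-level task
# also flow through h1, while the low-level loss regularizes it directly.
```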
SOTA for Relation Extraction on 1B Words
Multi-task learning (MTL) has led to successes in many applications of machine learning, from natural language processing and speech recognition to computer vision and drug discovery.
Lane detection is an important yet challenging task in autonomous driving, affected by many factors, e.g., lighting conditions, occlusions caused by other vehicles, irrelevant markings on the road, and the inherently long, thin shape of lanes.
#2 best model for Lane Detection on CULane
Numerous deep learning applications benefit from multi-task learning with multiple regression and classification objectives.
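Mixing regression and classification objectives raises the question of how to weight each task's loss. One well-known approach weights losses by learned homoscedastic uncertainty; the helper below implements a commonly used simplified form of that objective, where each task has a learnable log-variance `s_i` and the combined loss is `sum_i(exp(-s_i) * L_i + s_i)`. The function name is chosen for illustration.

```python
import numpy as np

def weighted_multitask_loss(losses, log_vars):
    """Combine per-task losses with learned uncertainty-based weights.

    losses:   list of scalar per-task loss values L_i
    log_vars: list of learnable parameters s_i = log(sigma_i^2)

    total = sum_i( exp(-s_i) * L_i + s_i )
    A larger s_i downweights task i's loss, while the + s_i term
    penalizes making every uncertainty arbitrarily large.
    """
    total = 0.0
    for L, s in zip(losses, log_vars):
        total += np.exp(-s) * L + s
    return total
```

With all `s_i = 0` this reduces to a plain sum of the task losses; during training the `s_i` are optimized jointly with the network weights.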