ICLR 2020

Plug and Play Language Models: A Simple Approach to Controlled Text Generation

ICLR 2020 huggingface/transformers

Large transformer-based language models (LMs) trained on huge text corpora have shown unparalleled generation capabilities.

LANGUAGE MODELLING · TEXT GENERATION

Reformer: The Efficient Transformer

ICLR 2020 huggingface/transformers

Large Transformer models routinely achieve state-of-the-art results on a number of tasks, but training these models can be prohibitively costly, especially on long sequences.

LANGUAGE MODELLING

ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators

ICLR 2020 huggingface/transformers

Instead of masking the input, our approach corrupts it by replacing some tokens with plausible alternatives sampled from a small generator network. Then, instead of training a model that predicts the original identities of the corrupted tokens, we train a discriminative model that predicts whether each token in the corrupted input was replaced by a generator sample or not.

LANGUAGE MODELLING · NATURAL LANGUAGE UNDERSTANDING
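Below is a minimal PyTorch-style sketch of the replaced-token-detection objective the ELECTRA entry above describes. The names (`electra_step`, `generator`, `discriminator`) are illustrative assumptions, not the authors' released code; the paper additionally up-weights the discriminator loss, which is omitted here.

```python
import torch
import torch.nn.functional as F

def electra_step(generator, discriminator, token_ids, mask_positions, mask_id):
    """One replaced-token-detection training step (sketch).

    token_ids:      (batch, seq) original token ids
    mask_positions: (batch, seq) bool mask of positions to corrupt
    """
    # 1. Corrupt the input: mask some positions and let a small generator
    #    propose plausible replacement tokens for them.
    masked = token_ids.clone()
    masked[mask_positions] = mask_id
    gen_logits = generator(masked)                       # (batch, seq, vocab)
    sampled = torch.distributions.Categorical(logits=gen_logits).sample()
    corrupted = token_ids.clone()
    corrupted[mask_positions] = sampled[mask_positions]

    # 2. The discriminator predicts, for every token, whether it was replaced
    #    by a generator sample (1) or kept from the original input (0).
    is_replaced = (corrupted != token_ids).float()
    disc_logits = discriminator(corrupted).squeeze(-1)   # (batch, seq)
    disc_loss = F.binary_cross_entropy_with_logits(disc_logits, is_replaced)

    # The generator itself is trained with an ordinary masked-LM loss.
    gen_loss = F.cross_entropy(gen_logits[mask_positions],
                               token_ids[mask_positions])
    return gen_loss + disc_loss
```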

Well-Read Students Learn Better: On the Importance of Pre-training Compact Models

ICLR 2020 google-research/bert

Recent developments in natural language representations have been accompanied by large and expensive models that leverage vast amounts of general-domain text through self-supervised pre-training.

LANGUAGE MODELLING · MODEL COMPRESSION · SENTIMENT ANALYSIS

ProtoAttend: Attention-Based Prototypical Learning

ICLR 2020 google-research/google-research

We propose a novel inherently interpretable machine learning method that bases decisions on a few relevant examples that we call prototypes.

DECISION MAKING · INTERPRETABLE MACHINE LEARNING
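A hedged sketch of the attention-over-examples idea behind the ProtoAttend entry above (names such as `prototype_predict` and `encoder` are illustrative assumptions, not the released google-research code): the prediction for a query is a weighted combination of candidate examples, and the attention weights indicate which examples, the prototypes, drove the decision.

```python
import torch
import torch.nn.functional as F

def prototype_predict(encoder, query_x, candidate_x, candidate_y, num_classes):
    """Predict via attention over a database of candidate examples (sketch)."""
    q = encoder(query_x)                      # (b, d) query embeddings
    k = encoder(candidate_x)                  # (n, d) candidate embeddings
    # Attention over candidates; high-weight candidates act as prototypes.
    weights = F.softmax(q @ k.t() / k.shape[-1] ** 0.5, dim=-1)   # (b, n)
    onehot = F.one_hot(candidate_y, num_classes).float()          # (n, C)
    # The decision is a weighted vote over the candidates' labels, so the
    # attention weights double as an explanation of the prediction.
    probs = weights @ onehot                                      # (b, C)
    return probs, weights
```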

On Mutual Information Maximization for Representation Learning

ICLR 2020 google-research/google-research

Many recent methods for unsupervised or self-supervised representation learning train feature extractors by maximizing an estimate of the mutual information (MI) between different views of the data.

REPRESENTATION LEARNING · SELF-SUPERVISED IMAGE CLASSIFICATION
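One common way to "maximize an estimate of the MI between different views", and a typical instance of the estimator family analyzed in the paper above, is an InfoNCE-style contrastive bound. The sketch below is illustrative only (the `encoder` and `infonce_loss` names are assumptions, not code from the paper):

```python
import torch
import torch.nn.functional as F

def infonce_loss(encoder, view1, view2, temperature=0.1):
    """InfoNCE loss between two views of the same batch of examples (sketch)."""
    z1 = F.normalize(encoder(view1), dim=-1)   # (batch, d)
    z2 = F.normalize(encoder(view2), dim=-1)   # (batch, d)
    logits = z1 @ z2.t() / temperature         # (batch, batch) similarities
    # Matching pairs (i, i) are positives; every other pair is a negative.
    targets = torch.arange(z1.shape[0], device=z1.device)
    # Minimizing this cross-entropy maximizes a lower bound on the mutual
    # information between the representations of the two views.
    return F.cross_entropy(logits, targets)
```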

TabNet: Attentive Interpretable Tabular Learning

ICLR 2020 google-research/google-research

We propose TabNet, a novel high-performance and interpretable canonical deep learning architecture for tabular data.

DECISION MAKING · FEATURE SELECTION · SELF-SUPERVISED LEARNING · UNSUPERVISED REPRESENTATION LEARNING