Trending Research

On the Variance of the Adaptive Learning Rate and Beyond

8 Aug 2019 LiyuanLucasLiu/RAdam

The learning rate warmup heuristic achieves remarkable success in stabilizing training, accelerating convergence and improving generalization for adaptive stochastic optimization algorithms like RMSprop and Adam.

IMAGE CLASSIFICATION · LANGUAGE MODELLING · MACHINE TRANSLATION · STOCHASTIC OPTIMIZATION

775
3.18 stars / hour
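The abstract refers to the hand-tuned learning rate warmup heuristic that RAdam aims to make unnecessary. Below is a minimal sketch of that heuristic, not the RAdam rectification itself: linear warmup layered on top of vanilla Adam in PyTorch. The stand-in model, base learning rate, and warmup length are placeholder assumptions.

```python
# Minimal sketch (not the RAdam rectification itself): the linear warmup
# heuristic the abstract refers to, layered on top of vanilla Adam in PyTorch.
# The model, base_lr, and warmup_steps below are placeholder assumptions.
import torch

model = torch.nn.Linear(10, 2)             # stand-in model
base_lr = 1e-3                             # assumed base learning rate
warmup_steps = 1000                        # assumed warmup length

optimizer = torch.optim.Adam(model.parameters(), lr=base_lr)

# Linearly ramp the learning rate from ~0 to base_lr over warmup_steps,
# then hold it constant; RAdam's point is to avoid tuning this by hand.
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer,
    lr_lambda=lambda step: min(1.0, (step + 1) / warmup_steps),
)

for step in range(5):                      # training-loop skeleton
    loss = model(torch.randn(4, 10)).pow(2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    scheduler.step()
```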

FoveaBox: Beyond Anchor-based Object Detector

8 Apr 2019 taokong/FoveaBox

We present FoveaBox, an accurate, flexible and completely anchor-free framework for object detection.

#22 best model for Object Detection on COCO

OBJECT DETECTION

144
1.16 stars / hour

Behaviour Suite for Reinforcement Learning

9 Aug 2019 deepmind/bsuite

bsuite is a collection of carefully-designed experiments that investigate core capabilities of reinforcement learning (RL) agents with two objectives.

466
0.55 stars / hour

Data Programming: Creating Large Training Sets, Quickly

NeurIPS 2016 HazyResearch/snorkel

Additionally, in initial user studies we observed that data programming may be an easier way for non-experts to create machine learning models when training data is limited or unavailable.

RELATION MENTION EXTRACTION · SLOT FILLING

3,053
0.55 stars / hour
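To make the data-programming idea concrete: instead of hand-annotating examples, users write several noisy "labeling functions" whose votes are combined into training labels. The toy majority-vote combiner below is an illustration only; the labeling functions and label names are hypothetical, and Snorkel itself uses a generative label model rather than a simple vote.

```python
# Illustration only: data programming writes several noisy "labeling
# functions" and combines their votes into training labels. This majority-vote
# combiner is a stand-in for Snorkel's generative label model, not its API.
from collections import Counter

ABSTAIN, SPAM, HAM = None, 1, 0

def lf_contains_link(text):
    return SPAM if "http://" in text or "https://" in text else ABSTAIN

def lf_contains_please(text):
    return HAM if "please" in text.lower() else ABSTAIN

def lf_all_caps(text):
    return SPAM if text.isupper() else ABSTAIN

LABELING_FUNCTIONS = [lf_contains_link, lf_contains_please, lf_all_caps]

def weak_label(text):
    """Majority vote over the labeling functions that do not abstain."""
    votes = [lf(text) for lf in LABELING_FUNCTIONS if lf(text) is not ABSTAIN]
    return Counter(votes).most_common(1)[0][0] if votes else ABSTAIN

print(weak_label("Win a prize now http://spam.example"))  # -> 1 (SPAM)
print(weak_label("Could you please review my patch?"))    # -> 0 (HAM)
```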

U-GAT-IT: Unsupervised Generative Attentional Networks with Adaptive Layer-Instance Normalization for Image-to-Image Translation

25 Jul 2019 taki0112/UGATIT

We propose a novel method for unsupervised image-to-image translation, which incorporates a new attention module and a new learnable normalization function in an end-to-end manner.

UNSUPERVISED IMAGE-TO-IMAGE TRANSLATION

3,163
0.52 stars / hour
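The "learnable normalization function" named in the title is adaptive layer-instance normalization: blend instance-norm and layer-norm statistics with a learnable ratio, then scale and shift with externally supplied gamma and beta. The sketch below is one reading of that idea, not the code from taki0112/UGATIT; the tensor shapes, initial ratio, and eps value are assumptions.

```python
# Rough sketch of adaptive layer-instance normalization: mix instance-norm and
# layer-norm statistics with a learnable ratio rho, then apply externally
# supplied gamma/beta. Illustrative reading, not the taki0112/UGATIT code.
import torch
import torch.nn as nn

class AdaLIN(nn.Module):
    def __init__(self, num_features, eps=1e-5):
        super().__init__()
        self.eps = eps
        # rho controls the blend between instance-norm and layer-norm statistics.
        self.rho = nn.Parameter(torch.full((1, num_features, 1, 1), 0.9))

    def forward(self, x, gamma, beta):
        # Instance norm: statistics per sample, per channel (over H, W).
        in_mean = x.mean(dim=(2, 3), keepdim=True)
        in_var = x.var(dim=(2, 3), keepdim=True, unbiased=False)
        # Layer norm: statistics per sample (over C, H, W).
        ln_mean = x.mean(dim=(1, 2, 3), keepdim=True)
        ln_var = x.var(dim=(1, 2, 3), keepdim=True, unbiased=False)

        out_in = (x - in_mean) / torch.sqrt(in_var + self.eps)
        out_ln = (x - ln_mean) / torch.sqrt(ln_var + self.eps)
        out = self.rho * out_in + (1 - self.rho) * out_ln
        # gamma/beta of shape (N, C), e.g. predicted from a style code.
        return out * gamma.unsqueeze(2).unsqueeze(3) + beta.unsqueeze(2).unsqueeze(3)

x = torch.randn(2, 64, 32, 32)
gamma, beta = torch.ones(2, 64), torch.zeros(2, 64)
print(AdaLIN(64)(x, gamma, beta).shape)  # torch.Size([2, 64, 32, 32])
```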

Self-Attention Generative Adversarial Networks

arXiv 2018 jantic/DeOldify

In this paper, we propose the Self-Attention Generative Adversarial Network (SAGAN) which allows attention-driven, long-range dependency modeling for image generation tasks.

CONDITIONAL IMAGE GENERATION

7,671
0.34 stars / hour
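A minimal sketch of self-attention over convolutional feature maps, in the spirit of SAGAN's attention layer: channel-reduced query/key projections, a full-channel value projection, and a learnable residual gate initialized to zero. The reduction factor and initialization are assumptions, not a line-for-line copy of the paper.

```python
# Minimal sketch of self-attention over convolutional feature maps, in the
# spirit of SAGAN. Reduction factor and initialization are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttention2d(nn.Module):
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // reduction, kernel_size=1)
        self.key = nn.Conv2d(channels, channels // reduction, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        # Start as an identity map; the gate learns how much attention to mix in.
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        n, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # (N, HW, C//r)
        k = self.key(x).flatten(2)                     # (N, C//r, HW)
        v = self.value(x).flatten(2)                   # (N, C, HW)
        attn = F.softmax(q @ k, dim=-1)                # (N, HW, HW): long-range weights
        out = (v @ attn.transpose(1, 2)).view(n, c, h, w)
        return self.gamma * out + x                    # gated residual connection

x = torch.randn(2, 64, 16, 16)
print(SelfAttention2d(64)(x).shape)  # torch.Size([2, 64, 16, 16])
```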

Progressive Growing of GANs for Improved Quality, Stability, and Variation

ICLR 2018 jantic/DeOldify

We describe a new training methodology for generative adversarial networks.

FACE GENERATION

7,671
0.34 stars / hour

GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium

NeurIPS 2017 jantic/DeOldify

Generative Adversarial Networks (GANs) excel at creating realistic images with complex models for which maximum likelihood is infeasible.

IMAGE GENERATION

7,671
0.34 stars / hour
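The two time-scale update rule in the title amounts to training the discriminator and generator with separate, unequal learning rates. A bare-bones sketch follows; the stand-in modules, loss, and the specific rates and betas are illustrative placeholders, not values prescribed by the paper.

```python
# The "two time-scale update rule" boils down to separate learning rates for
# discriminator and generator. Placeholder modules and illustrative values.
import torch
import torch.nn.functional as F

generator = torch.nn.Linear(100, 784)        # stand-in generator
discriminator = torch.nn.Linear(784, 1)      # stand-in discriminator

# Faster discriminator, slower generator: the two "time scales".
opt_d = torch.optim.Adam(discriminator.parameters(), lr=4e-4, betas=(0.0, 0.9))
opt_g = torch.optim.Adam(generator.parameters(), lr=1e-4, betas=(0.0, 0.9))

for step in range(3):                        # GAN-loop skeleton
    z = torch.randn(8, 100)
    fake = generator(z)

    # Discriminator step (real-data term omitted for brevity).
    d_loss = F.softplus(discriminator(fake.detach())).mean()
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step.
    g_loss = F.softplus(-discriminator(fake)).mean()
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```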