On Data Augmentation for GAN Training

Recent successes in Generative Adversarial Networks (GANs) have affirmed the importance of using more data in GAN training. Yet it is expensive to collect data in many domains such as medical applications...
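The core idea named in the title, augmenting data for GAN training, can be illustrated with a minimal sketch: the same stochastic augmentation (here a random horizontal flip, a hypothetical stand-in for the augmentations studied in the paper) is applied to both real and generated batches before they are scored by the discriminator.

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(batch, rng):
    """Randomly flip each (H, W) image in a (N, H, W) batch along its width.

    A hypothetical stand-in for the augmentations studied in the paper;
    the key point is that real and fake batches pass through the same
    transformation before reaching the discriminator.
    """
    flip = rng.random(len(batch)) < 0.5
    out = batch.copy()
    out[flip] = out[flip][:, :, ::-1]  # reverse the width axis of selected images
    return out

# Toy batches of 8x8 grayscale "images" standing in for real and generated data
real = rng.standard_normal((4, 8, 8))
fake = rng.standard_normal((4, 8, 8))

# Augment both streams identically before discriminator scoring
real_aug = augment(real, rng)
fake_aug = augment(fake, rng)
```

Augmenting only the real data would let the discriminator learn the augmentation itself as a real/fake cue, which is why both streams are transformed.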


Methods used in the Paper


METHOD                   TYPE
Residual Connection      Skip Connections
ReLU                     Activation Functions
Instance Normalization   Normalization
Sigmoid Activation       Activation Functions
Batch Normalization      Normalization
Tanh Activation          Activation Functions
Cycle Consistency Loss   Loss Functions
PatchGAN                 Discriminators
Residual Block           Skip Connection Blocks
Leaky ReLU               Activation Functions
Convolution              Convolutions
GAN Least Squares Loss   Loss Functions
CycleGAN                 Generative Models
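One of the listed components, the cycle consistency loss used by CycleGAN-style models, has a compact definition: the L1 distance between an input and its round trip through the two generators, L_cyc = ||F(G(x)) - x||_1. A minimal sketch, with simple invertible linear maps as hypothetical stand-ins for the generators G and F:

```python
import numpy as np

def cycle_consistency_loss(x, G, F):
    """Mean L1 cycle loss ||F(G(x)) - x||_1 over all elements of x."""
    return np.mean(np.abs(F(G(x)) - x))

# Hypothetical stand-in "generators": F is the exact inverse of G,
# so the cycle loss should be (numerically) zero.
G = lambda x: 2.0 * x + 1.0
F = lambda y: (y - 1.0) / 2.0

x = np.linspace(-1.0, 1.0, 16).reshape(4, 4)
loss = cycle_consistency_loss(x, G, F)  # ~0, since F inverts G
```

In actual CycleGAN training this term is computed in both directions (x -> G -> F and y -> F -> G) and added, with a weighting coefficient, to the adversarial losses such as the GAN least squares loss also listed above.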