Evolving Normalization-Activation Layers

Normalization layers and activation functions are fundamental components in deep networks and typically co-locate with each other. Here we propose to design them using an automated approach. Instead of designing them separately, we unify them into a single tensor-to-tensor computation graph, and evolve its structure starting from basic mathematical functions. Our layer search algorithm leads to the discovery of EvoNorms, a set of new normalization-activation layers with novel, and sometimes surprising, structures that go beyond existing design patterns.

NeurIPS 2020
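
As an illustration of the kind of unified tensor-to-tensor layer the search discovers, below is a minimal NumPy sketch of EvoNorm-S0, one of the layers reported in the paper: a sigmoid-gated (Swish-like) numerator divided by a group-wise standard deviation, so normalization and activation are computed jointly rather than as two sequential modules. The NHWC layout, function names, and group count here are our own illustrative choices, not the authors' reference implementation.

```python
import numpy as np

def group_std(x, groups=32, eps=1e-5):
    """Group-wise standard deviation over (H, W, C/groups) for NHWC input,
    in the spirit of Group Normalization."""
    n, h, w, c = x.shape
    assert c % groups == 0, "channel count must be divisible by the group count"
    g = x.reshape(n, h, w, groups, c // groups)
    var = g.var(axis=(1, 2, 4), keepdims=True)   # one variance per (sample, group)
    std = np.sqrt(var + eps)
    return np.broadcast_to(std, g.shape).reshape(n, h, w, c)

def evonorm_s0(x, gamma, beta, v, groups=32, eps=1e-5):
    """Sketch of the EvoNorm-S0 forward pass:
    y = x * sigmoid(v * x) / group_std(x) * gamma + beta."""
    gate = 1.0 / (1.0 + np.exp(-v * x))          # Swish-like gate, sigmoid(v * x)
    return x * gate / group_std(x, groups, eps) * gamma + beta

# Example usage: per-channel parameters for a 64-channel feature map.
x = np.random.randn(2, 16, 16, 64).astype(np.float32)
gamma = np.ones((1, 1, 1, 64))
beta = np.zeros((1, 1, 1, 64))
v = np.ones((1, 1, 1, 64))
y = evonorm_s0(x, gamma, beta, v)
print(y.shape)  # (2, 16, 16, 64)
```

Note that, unlike Batch Normalization, this layer uses only per-sample group statistics and no batch statistics, so for a fixed set of parameters it behaves identically at training and inference time.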


Methods used in the Paper


METHOD                                      TYPE
RMSProp                                     Stochastic Optimization
Pointwise Convolution                       Convolutions
Depthwise Convolution                       Convolutions
Depthwise Separable Convolution             Convolutions
MobileNetV2                                 Image Models
Squeeze-and-Excitation Block                Image Model Blocks
Dropout                                     Regularization
Average Pooling                             Pooling Operations
Dense Connections                           Feedforward Networks
Inverted Residual Block                     Skip Connection Blocks
EfficientNet                                Image Models
Max Pooling                                 Pooling Operations
Global Average Pooling                      Pooling Operations
Kaiming Initialization                      Initialization
ResNet                                      Convolutional Neural Networks
Group Normalization                         Normalization
FPN                                         Feature Extractors
Feedforward Network                         Feedforward Networks
Conditional Batch Normalization             Normalization
TTUR                                        Optimization
GAN Hinge Loss                              Loss Functions
Non-Local Operation                         Image Feature Extractors
Non-Local Block                             Image Model Blocks
Truncation Trick                            Latent Variable Sampling
Linear Layer                                Feedforward Networks
Dot-Product Attention                       Attention Mechanisms
Projection Discriminator                    Discriminators
Spectral Normalization                      Normalization
Off-Diagonal Orthogonal Regularization      Regularization
Adam                                        Stochastic Optimization
Early Stopping                              Regularization
Tanh Activation                             Activation Functions
SAGAN Self-Attention Module                 Attention Modules
SAGAN                                       Generative Adversarial Networks
BigGAN                                      Generative Models
RoIAlign                                    RoI Feature Extractors
Entropy Regularization                      Regularization
Residual Connection                         Skip Connections
Mask R-CNN                                  Instance Segmentation Models
Sigmoid Activation                          Activation Functions
ReLU                                        Activation Functions
Softmax                                     Output Functions
Bottleneck Residual Block                   Skip Connection Blocks
Residual Block                              Skip Connection Blocks
Swish                                       Activation Functions
1x1 Convolution                             Convolutions
Convolution                                 Convolutions
Batch Normalization                         Normalization
SpineNet                                    Convolutional Neural Networks