Polynomial Rate Decay

Polynomial Rate Decay is a learning rate schedule that decays the learning rate from an initial value lr_0 to an end value lr_end as a polynomial function of the training step. A common formulation is lr(t) = (lr_0 - lr_end) * (1 - t / T)^p + lr_end, where T is the total number of decay steps and p is the power of the polynomial; p = 1 recovers linear decay, while larger powers shrink the rate more aggressively early in training.
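
Below is a minimal sketch of this schedule in Python, assuming the formulation above; the names lr_start, lr_end, power, and total_steps are illustrative and not taken from any particular library.

```python
def polynomial_decay(step, total_steps=10000, lr_start=0.01, lr_end=1e-4, power=2.0):
    """Polynomially decay the learning rate from lr_start to lr_end over total_steps."""
    # Clamp the step so the rate stays at lr_end after the decay horizon.
    step = min(step, total_steps)
    decay = (1.0 - step / total_steps) ** power
    return (lr_start - lr_end) * decay + lr_end

# Example: linear decay is the special case power = 1.0.
for s in (0, 2500, 5000, 10000):
    print(s, polynomial_decay(s, power=1.0))
```

Equivalent ready-made schedules exist in common frameworks, e.g. tf.keras.optimizers.schedules.PolynomialDecay in TensorFlow and torch.optim.lr_scheduler.PolynomialLR in PyTorch.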

Latest Papers

YOLOv4: Optimal Speed and Accuracy of Object Detection
| Alexey Bochkovskiy, Chien-Yao Wang, Hong-Yuan Mark Liao
2020-04-23
LiteSeg: A Novel Lightweight ConvNet for Semantic Segmentation
| Taha Emara, Hossam E. Abd El Munim, Hazem M. Abbas
2019-12-13
CSPNet: A New Backbone that can Enhance Learning Capability of CNN
| Chien-Yao Wang, Hong-Yuan Mark Liao, I-Hau Yeh, Yueh-Hua Wu, Ping-Yang Chen, Jun-Wei Hsieh
2019-11-27
PSANet: Point-wise Spatial Attention Network for Scene Parsing
| Hengshuang Zhao, Yi Zhang, Shu Liu, Jianping Shi, Chen Change Loy, Dahua Lin, Jiaya Jia
2018-09-01
SqueezeNext: Hardware-Aware Neural Network Design
| Amir Gholami, Kiseok Kwon, Bichen Wu, Zizheng Tai, Xiangyu Yue, Peter Jin, Sicheng Zhao, Kurt Keutzer
2018-03-23
Rethinking Atrous Convolution for Semantic Image Segmentation
| Liang-Chieh Chen, George Papandreou, Florian Schroff, Hartwig Adam
2017-06-17
YOLO9000: Better, Faster, Stronger
| Joseph Redmon, Ali Farhadi
2016-12-25
Pyramid Scene Parsing Network
| Hengshuang Zhao, Jianping Shi, Xiaojuan Qi, Xiaogang Wang, Jiaya Jia
2016-12-04
DeepLab: Semantic Image Segmentation with Deep Convolutional Nets, Atrous Convolution, and Fully Connected CRFs
| Liang-Chieh Chen, George Papandreou, Iasonas Kokkinos, Kevin Murphy, Alan L. Yuille
2016-06-02

Components

No components found.

Categories