Search Results for author: Daeyeon Kim

Found 2 papers, 0 papers with code

Learning Deeply Shared Filter Bases for Efficient ConvNets

no code implementations • 1 Jan 2021 • Woochul Kang, Daeyeon Kim

In the proposed ConvNet architecture, convolution layers are decomposed into a filter basis, which can be shared recursively, and layer-specific parts.
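The decomposition can be sketched as follows. This is a minimal illustrative example, not the authors' implementation: a small bank of basis filters is shared across layers, and each layer stores only a coefficient matrix (its layer-specific part) that mixes the basis into that layer's filters. All names and sizes here are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared part: a small bank of 3x3 basis filters (sizes are illustrative).
num_basis, kh, kw = 4, 3, 3
basis = rng.standard_normal((num_basis, kh, kw))

def layer_filters(coeffs, basis):
    """Build a layer's filters as linear combinations of the shared basis.

    coeffs: (out_channels, num_basis) layer-specific coefficient matrix.
    Returns filters of shape (out_channels, kh, kw).
    """
    flat = basis.reshape(len(basis), -1)        # (num_basis, kh*kw)
    return (coeffs @ flat).reshape(-1, *basis.shape[1:])

# Two layers share the same basis but keep their own coefficients.
coeffs1 = rng.standard_normal((8, num_basis))
coeffs2 = rng.standard_normal((8, num_basis))
f1 = layer_filters(coeffs1, basis)
f2 = layer_filters(coeffs2, basis)

# Parameter count: shared basis + per-layer coefficients vs. two
# independent layers of full 8x3x3 filters.
shared_params = basis.size + coeffs1.size + coeffs2.size      # 36 + 32 + 32 = 100
independent_params = 2 * 8 * kh * kw                          # 144
print(f1.shape, f2.shape)              # (8, 3, 3) (8, 3, 3)
print(shared_params, "<", independent_params)
```

Even in this toy setting the shared-basis factorization stores fewer parameters than two independent layers, and the gap grows with the number of layers sharing the basis.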

Deeply Shared Filter Bases for Parameter-Efficient Convolutional Neural Networks

no code implementations • NeurIPS 2021 • Woochul Kang, Daeyeon Kim

In this paper, we present a recursive convolution block design and training method in which a recursively shareable part, or a filter basis, is separated and learned while effectively avoiding the vanishing/exploding gradient problem during training.
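The recursive-sharing idea above can be illustrated with a small sketch. This is an assumption-laden toy, not the paper's method: the same shared kernel is applied repeatedly (one application per "layer"), each application scaled by a tiny layer-specific parameter, with a residual connection used here purely as an illustrative stabilizer for repeated application.

```python
import numpy as np

def conv1d_same(x, k):
    # 'same'-padded 1D convolution (stand-in for a 2D conv layer).
    return np.convolve(x, k, mode="same")

shared_kernel = np.array([0.25, 0.5, 0.25])   # recursively shared part
layer_scales = [1.0, 0.8, 1.2]                # layer-specific parts (assumption)

x = np.ones(8)
for s in layer_scales:
    # Each recursion reuses the same kernel; only the scalar differs.
    # The residual connection is an illustrative choice, not from the paper.
    x = x + s * conv1d_same(x, shared_kernel)

print(x.shape)   # (8,)
```

The recursion reuses one kernel across all three applications, so depth grows without adding new filter parameters, which is the parameter-efficiency the abstract describes.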

Image Classification • Object Detection +1
