Search Results for author: Ziliang Zong

Found 13 papers, 4 papers with code

Reduce, Reuse, Recycle: Improving Training Efficiency with Distillation

no code implementations • 1 Nov 2022 • Cody Blakeney, Jessica Zosa Forde, Jonathan Frankle, Ziliang Zong, Matthew L. Leavitt

We conducted a series of experiments to investigate whether and how distillation can be used to accelerate training, using ResNet-50 trained on ImageNet and BERT trained on C4 with a masked language modeling objective and evaluated on GLUE, on common enterprise hardware (8x NVIDIA A100 GPUs).

Image Classification Language Modelling +1
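The snippet above describes the experimental setup rather than the mechanism; for a concrete picture of what distillation-based training looks like, here is a minimal, hedged sketch of a generic knowledge-distillation training step in PyTorch. The temperature T, the weighting alpha, and the model/optimizer names are illustrative assumptions, not the configuration used in the paper.

import torch
import torch.nn.functional as F

def distillation_step(student, teacher, images, labels, optimizer, T=4.0, alpha=0.5):
    # One generic KD step: blend hard-label cross-entropy with a KL term
    # that matches the student's softened logits to the teacher's.
    with torch.no_grad():
        teacher_logits = teacher(images)          # no gradients through the teacher
    student_logits = student(images)
    ce = F.cross_entropy(student_logits, labels)  # supervised loss on true labels
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)                                   # rescale so gradients stay comparable across T
    loss = alpha * ce + (1.0 - alpha) * kd
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

In this standard formulation, accelerating training amounts to reaching a target accuracy in fewer such steps than training without the teacher term.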

Learning Omnidirectional Flow in 360-degree Video via Siamese Representation

no code implementations • 7 Aug 2022 • Keshav Bhandari, Bin Duan, Gaowen Liu, Hugo Latapie, Ziliang Zong, Yan Yan

Optical flow estimation in omnidirectional videos faces two significant issues: the lack of benchmark datasets and the challenge of adapting perspective video-based methods to the omnidirectional nature of 360-degree video.

Optical Flow Estimation Representation Learning

Lipschitz Continuity Retained Binary Neural Network

1 code implementation • 13 Jul 2022 • Yuzhang Shang, Dan Xu, Bin Duan, Ziliang Zong, Liqiang Nie, Yan Yan

Relying on the premise that the performance of a binary neural network can be largely restored by eliminating the quantization error between full-precision weight vectors and their corresponding binary vectors, existing network binarization works frequently adopt the idea of model robustness to reach this objective.

Binarization Quantization
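The quantization error mentioned in the abstract is commonly formalized as the distance between a full-precision weight tensor w and its binary approximation alpha * sign(w). The sketch below illustrates that generic formulation (XNOR-Net style, with alpha = mean(|w|) as the closed-form minimizer of the L2 error); it is an illustration of the standard setup, not code from this paper.

import torch

def binarize_with_scale(w: torch.Tensor):
    # Approximate w by alpha * sign(w); alpha = mean(|w|) minimizes ||w - alpha*sign(w)||^2.
    alpha = w.abs().mean()
    w_bin = alpha * torch.sign(w)
    quant_error = torch.norm(w - w_bin)  # the quantity binarization methods try to shrink
    return w_bin, alpha, quant_error

# Example: quantization error of a random 64-filter 3x3 convolution layer
w = torch.randn(64, 3, 3, 3)
w_bin, alpha, err = binarize_with_scale(w)
print(f"alpha={alpha:.4f}, ||w - w_bin||_2={err:.4f}")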

Network Binarization via Contrastive Learning

1 code implementation • 6 Jul 2022 • Yuzhang Shang, Dan Xu, Ziliang Zong, Liqiang Nie, Yan Yan

Neural network binarization accelerates deep models by quantizing their weights and activations to 1 bit.

Binarization Contrastive Learning +2
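The 1-bit quantization mentioned above is typically trained with a sign function in the forward pass and a straight-through estimator (STE) in the backward pass. The snippet below is a generic sketch of that common pattern, assumed here purely for illustration and not taken from the paper's released code.

import torch

class BinarizeSTE(torch.autograd.Function):
    # Forward: quantize to {-1, +1} with sign().
    # Backward: straight-through estimator, passing gradients only where |x| <= 1.
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        return grad_output * (x.abs() <= 1).float()

# Example: binarized values still receive (clipped) gradients through the STE
x = torch.randn(4, requires_grad=True)
y = BinarizeSTE.apply(x).sum()
y.backward()
print(x.grad)  # 1.0 where |x| <= 1, else 0.0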

Win the Lottery Ticket via Fourier Analysis: Frequencies Guided Network Pruning

no code implementations • 30 Jan 2022 • Yuzhang Shang, Bin Duan, Ziliang Zong, Liqiang Nie, Yan Yan

Extensive experiments on CIFAR-10 and CIFAR-100 demonstrate the superiority of our Fourier-analysis-based MBP over traditional MBP algorithms.

Knowledge Distillation Network Pruning

Contrastive Mutual Information Maximization for Binary Neural Networks

no code implementations • 29 Sep 2021 • Yuzhang Shang, Dan Xu, Ziliang Zong, Liqiang Nie, Yan Yan

Neural network binarization accelerates deep models by quantizing their weights and activations to 1 bit.

Binarization Contrastive Learning +2

Lipschitz Continuity Guided Knowledge Distillation

no code implementations • ICCV 2021 • Yuzhang Shang, Bin Duan, Ziliang Zong, Liqiang Nie, Yan Yan

Knowledge distillation, which transfers knowledge from a larger teacher network to a smaller student network, has become one of the most important model compression techniques.

Knowledge Distillation Model Compression +2

Simon Says: Evaluating and Mitigating Bias in Pruned Neural Networks with Knowledge Distillation

1 code implementation • 15 Jun 2021 • Cody Blakeney, Nathaniel Huish, Yan Yan, Ziliang Zong

In recent years, the ubiquitous deployment of AI has raised serious concerns regarding algorithmic bias, discrimination, and fairness.

Fairness Knowledge Distillation

Parallel Blockwise Knowledge Distillation for Deep Neural Network Compression

1 code implementation • 5 Dec 2020 • Cody Blakeney, Xiaomin Li, Yan Yan, Ziliang Zong

The experimental results, obtained on an AMD server with four GeForce RTX 2080 Ti GPUs, show that our algorithm achieves a 3x speedup plus 19% energy savings on VGG distillation and a 3.5x speedup plus 29% energy savings on ResNet distillation, both with negligible accuracy loss.

Knowledge Distillation Neural Network Compression +3

EgoK360: A 360 Egocentric Kinetic Human Activity Video Dataset

no code implementations • 15 Oct 2020 • Keshav Bhandari, Mario A. DeLaGarza, Ziliang Zong, Hugo Latapie, Yan Yan

To bridge this gap, in this paper we propose a novel Egocentric (first-person) 360° Kinetic human activity video dataset (EgoK360).

Egocentric Activity Recognition Video Understanding

Revisiting Optical Flow Estimation in 360 Videos

no code implementations • 15 Oct 2020 • Keshav Bhandari, Ziliang Zong, Yan Yan

Second, we refine the network by training with augmented data in a supervised manner.

Data Augmentation Domain Adaptation +1
