Search Results for author: Yiqian He

Found 1 papers, 0 papers with code

RdimKD: Generic Distillation Paradigm by Dimensionality Reduction

no code implementations • 14 Dec 2023 • Yi Guo, Yiqian He, Xiaoyang Li, Haotong Qin, Van Tung Pham, Yang Zhang, Shouda Liu

Knowledge Distillation (KD) has emerged as one of the most promising compression techniques for running advanced deep neural networks on resource-limited devices.

Tasks: Dimensionality Reduction, Knowledge Distillation
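For context, the classic distillation objective that methods like RdimKD build on can be sketched as follows. This is a minimal, dependency-free illustration of the standard temperature-scaled KD loss (Hinton et al.), not the paper's RdimKD formulation; the function names and the temperature value are illustrative choices.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature softens the distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 as in the standard distillation formulation.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl

# Identical logits give zero loss; mismatched logits give a positive loss.
print(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))
print(kd_loss([0.5, 1.5, 0.0], [2.0, 1.0, 0.1]))
```

In practice the student is trained on a weighted sum of this loss and the usual cross-entropy on ground-truth labels; RdimKD's contribution, per the title, is to apply distillation after reducing the dimensionality of the distilled representations.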
