1 code implementation • 14 Dec 2023 • Guiqin Wang, Peng Zhao, Yanjiang Shi, Cong Zhao, Shusen Yang
To address this gap, our paper introduces a novel knowledge distillation framework that leverages a generative model to train a lightweight student model.
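The abstract does not detail the distillation objective, but a standard knowledge distillation loss (temperature-softened KL divergence between teacher and student outputs, as in Hinton et al.) is a minimal sketch of how a lightweight student is typically trained; the function names and the temperature value here are illustrative assumptions, not the paper's method:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 as in standard knowledge distillation.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T * T)

# Toy logits: the student is trained to match the teacher's soft targets.
teacher = np.array([[4.0, 1.0, 0.5]])
student = np.array([[2.0, 1.5, 0.5]])
loss = distillation_loss(student, teacher)
```

A higher temperature `T` softens both distributions, transferring more of the teacher's relative rankings over non-target classes to the student.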
no code implementations • ICCV 2023 • Guiqin Wang, Peng Zhao, Cong Zhao, Shusen Yang, Jie Cheng, Luziwei Leng, Jianxing Liao, Qinghai Guo
To address this problem, we propose a novel attention-based hierarchically-structured latent model to learn the temporal variations of feature semantics.
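The paper's exact architecture is not given in this snippet, but the core mechanism it names, attention over temporal feature variations, can be sketched as scaled dot-product self-attention across the time axis; the array shapes and projection matrices below are illustrative assumptions:

```python
import numpy as np

def temporal_attention(X, Wq, Wk, Wv):
    # Scaled dot-product self-attention across the time axis of X
    # (shape: [T, d]), so each time step attends to all others.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    scores = scores - scores.max(axis=-1, keepdims=True)
    A = np.exp(scores)
    A = A / A.sum(axis=-1, keepdims=True)  # rows sum to 1
    return A @ V  # [T, d]: temporally re-weighted features

# Toy sequence of 4 time steps with 3-dimensional features.
rng = np.random.default_rng(0)
T_steps, d = 4, 3
X = rng.standard_normal((T_steps, d))
Wq = rng.standard_normal((d, d))
Wk = rng.standard_normal((d, d))
Wv = rng.standard_normal((d, d))
out = temporal_attention(X, Wq, Wk, Wv)
```

A hierarchically-structured latent model would stack such layers at multiple temporal resolutions; this sketch shows only a single attention step.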