Search Results for author: Xu Dai

Found 2 papers, 1 paper with code

Precise Knowledge Transfer via Flow Matching

no code implementations • 3 Feb 2024 • Shitong Shao, Zhiqiang Shen, Linrui Gong, Huanran Chen, Xu Dai

We name this framework Knowledge Transfer with Flow Matching (FM-KT), which can be integrated with a metric-based distillation method of any form (e.g., vanilla KD, DKD, PKD and DIST) and a meta-encoder with any available architecture (e.g., CNN, MLP and Transformer).

Transfer Learning
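
The abstract above describes FM-KT as a wrapper around any metric-based distillation loss. As an illustration only, here is a minimal sketch of the vanilla KD loss it names as one such option; the flow-matching meta-encoder itself is not reproduced, and the function name and temperature default are assumptions, not taken from the paper.

```python
# Minimal sketch of vanilla KD, one of the metric-based distillation
# losses the FM-KT abstract says the framework can wrap. The name
# vanilla_kd_loss and the temperature default are illustrative
# assumptions; the flow-matching meta-encoder is not shown.
import torch
import torch.nn.functional as F

def vanilla_kd_loss(student_logits: torch.Tensor,
                    teacher_logits: torch.Tensor,
                    temperature: float = 4.0) -> torch.Tensor:
    """Temperature-scaled KL divergence from teacher to student."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 to keep gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2
```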

Catch-Up Distillation: You Only Need to Train Once for Accelerating Sampling

1 code implementation • 18 May 2023 • Shitong Shao, Xu Dai, Shouyi Yin, Lujun Li, Huanran Chen, Yang Hu

On CIFAR-10, we obtain an FID of 2.80 by sampling in 15 steps under one-session training, and a new state-of-the-art FID of 3.37 by sampling in one step with additional training.

Knowledge Distillation
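
The reported numbers contrast 15-step and one-step sampling. As a rough illustration of what varying the step count means for a trained generative model, here is a generic Euler sampler for a velocity-prediction model; this is not the Catch-Up Distillation procedure, and `model`, the time parameterization, and the update rule are all assumptions.

```python
# Generic few-step Euler sampler for a trained velocity-prediction
# model, illustrating the step-count trade-off in the abstract
# (e.g. num_steps=15 vs. num_steps=1). This is NOT the paper's
# Catch-Up Distillation method; `model` and the t in [0, 1]
# parameterization are illustrative assumptions.
import torch

@torch.no_grad()
def euler_sample(model, shape, num_steps=15, device="cpu"):
    """Integrate dx/dt = v(x, t) from t=1 (pure noise) to t=0 (data)."""
    x = torch.randn(shape, device=device)
    ts = torch.linspace(1.0, 0.0, num_steps + 1, device=device)
    for t_cur, t_next in zip(ts[:-1], ts[1:]):
        v = model(x, t_cur.expand(shape[0]))  # predicted velocity at t_cur
        x = x + (t_next - t_cur) * v          # Euler step toward the data end
    return x

# Usage: samples = euler_sample(trained_model, (64, 3, 32, 32), num_steps=15)
```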
