Search Results for author: Lifeng Nai

Found 2 papers, 0 papers with code

TripLe: Revisiting Pretrained Model Reuse and Progressive Learning for Efficient Vision Transformer Scaling and Searching

No code implementations · ICCV 2023 · Cheng Fu, Hanxian Huang, Zixuan Jiang, Yun Ni, Lifeng Nai, Gang Wu, Liqun Cheng, Yanqi Zhou, Sheng Li, Andrew Li, Jishen Zhao

One promising way to accelerate transformer training is to reuse small pretrained models to initialize the larger transformer, since their existing representation power enables faster convergence.

Knowledge Distillation · Neural Architecture Search
