Search Results for author: Dingkun Zhang

Found 1 paper, 0 papers with code

LAPTOP-Diff: Layer Pruning and Normalized Distillation for Compressing Diffusion Models

no code implementations • 17 Apr 2024 • Dingkun Zhang, Sijia Li, Chen Chen, Qingsong Xie, Haonan Lu

To this end, we propose layer pruning and normalized distillation for compressing diffusion models (LAPTOP-Diff).

Knowledge Distillation
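The entry above only carries the paper's one-line summary, so the following is a minimal, generic sketch of the two ingredients it names: pruning whole layers from a teacher network and then distilling the teacher's outputs into the shallower student, with loss terms rescaled so they contribute comparably. Every name here (TinyNet, prune_layers, keep_idx, the scale-normalized weighting) is an illustrative assumption, not the LAPTOP-Diff algorithm or its code; the listing reports no code implementation for this paper.

```python
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F


def prune_layers(teacher_blocks: nn.ModuleList, keep_idx: list[int]) -> nn.ModuleList:
    """Form a shallower student by copying only a subset of the teacher's blocks."""
    return nn.ModuleList(copy.deepcopy(teacher_blocks[i]) for i in keep_idx)


class TinyNet(nn.Module):
    """Toy stand-in for a diffusion backbone: a stack of residual blocks."""

    def __init__(self, blocks: nn.ModuleList):
        super().__init__()
        self.blocks = blocks

    def forward(self, x):
        for blk in self.blocks:
            x = x + blk(x)
        return x


def distill_step(student, teacher, x, optimizer):
    """One knowledge-distillation step: the student mimics the frozen teacher.

    The loss is divided by its own detached value, a rough way to balance
    terms of very different scales (an illustrative 'normalized' weighting).
    """
    with torch.no_grad():
        t_out = teacher(x)
    s_out = student(x)
    out_loss = F.mse_loss(s_out, t_out)
    loss = out_loss / (out_loss.detach() + 1e-8)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return out_loss.item()


if __name__ == "__main__":
    blocks = nn.ModuleList(nn.Sequential(nn.Linear(16, 16), nn.SiLU()) for _ in range(8))
    teacher = TinyNet(blocks).eval()
    student = TinyNet(prune_layers(blocks, keep_idx=[0, 2, 4, 6]))  # drop half the depth
    opt = torch.optim.Adam(student.parameters(), lr=1e-3)
    x = torch.randn(4, 16)
    print(distill_step(student, teacher, x, opt))
```

Initializing the pruned student from the retained teacher blocks, as done above, is a common starting point for this kind of compression; how layers are chosen and how the distillation losses are weighted is exactly where methods such as LAPTOP-Diff differ from this generic sketch.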
