no code implementations • 12 Jan 2024 • Jialiang Tang, Shuo Chen, Gang Niu, Hongyuan Zhu, Joey Tianyi Zhou, Chen Gong, Masashi Sugiyama
We then build a fusion-activation mechanism to transfer valuable domain-invariant knowledge to the student network, while encouraging the adapter within the teacher network to learn the domain-specific knowledge of the target data.
no code implementations • ICCV 2023 • Jialiang Tang, Shuo Chen, Gang Niu, Masashi Sugiyama, Chen Gong
Knowledge distillation aims to learn a lightweight student network from a pre-trained teacher network.
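For context, the standard knowledge-distillation objective combines a hard-label cross-entropy term with a KL divergence between temperature-softened teacher and student outputs. The sketch below is a minimal NumPy illustration of this common Hinton-style loss, not the specific method of either paper; the function names and default temperature are illustrative choices.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution.
    e = np.exp((z - z.max(axis=-1, keepdims=True)) / T)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Standard KD loss: weighted sum of cross-entropy on hard labels and
    KL divergence between temperature-softened teacher/student outputs."""
    p_t = softmax(teacher_logits, T)   # soft teacher targets
    p_s = softmax(student_logits, T)   # soft student predictions
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    hard = -np.log(
        softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12
    )
    # The T^2 factor rescales soft-target gradients to balance the hard term.
    return np.mean(alpha * hard + (1 - alpha) * T**2 * kl)
```

When the student matches the teacher exactly, the KL term vanishes and only the hard-label cross-entropy remains.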