Search Results for author: Fotis Iliopoulos

Found 4 papers, 0 papers with code

SLaM: Student-Label Mixing for Distillation with Unlabeled Examples

no code implementations • NeurIPS 2023 • Vasilis Kontonis, Fotis Iliopoulos, Khoa Trinh, Cenk Baykal, Gaurav Menghani, Erik Vee

Knowledge distillation with unlabeled examples is a powerful training paradigm for generating compact and lightweight student models in applications where the amount of labeled data is limited but one has access to a large pool of unlabeled data.

Knowledge Distillation
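
The paradigm the abstract describes can be summarized in a short sketch: a large teacher soft-labels an unlabeled pool, and a compact student is trained against those soft labels. Everything below (the models, the temperature, the `unlabeled_loader`) is an illustrative assumption about a generic setup, not the SLaM algorithm itself.

```python
import torch
import torch.nn.functional as F

def distill_on_unlabeled(teacher, student, unlabeled_loader,
                         epochs=3, temperature=2.0, lr=1e-3):
    """Train a compact student on teacher soft labels over an unlabeled pool.

    Generic sketch of distillation with unlabeled examples; not the
    paper's SLaM (student-label mixing) method.
    """
    teacher.eval()
    opt = torch.optim.Adam(student.parameters(), lr=lr)
    for _ in range(epochs):
        for x in unlabeled_loader:  # no ground-truth labels needed
            with torch.no_grad():
                # Teacher produces soft pseudo-labels for the batch.
                t_probs = F.softmax(teacher(x) / temperature, dim=-1)
            s_log_probs = F.log_softmax(student(x) / temperature, dim=-1)
            # KL divergence between student and teacher distributions.
            loss = F.kl_div(s_log_probs, t_probs, reduction="batchmean")
            opt.zero_grad()
            loss.backward()
            opt.step()
    return student
```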

Weighted Distillation with Unlabeled Examples

no code implementations • 13 Oct 2022 • Fotis Iliopoulos, Vasilis Kontonis, Cenk Baykal, Gaurav Menghani, Khoa Trinh, Erik Vee

Our method is hyperparameter-free, data-agnostic, and simple to implement.
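
The snippet above gives no formula, so the following is only a hedged reading of the title: each teacher-labeled example gets a per-example weight on the distillation loss. The confidence-based weight used here (the teacher's max softmax probability) is a placeholder assumption, not the paper's actual weighting scheme.

```python
import torch
import torch.nn.functional as F

def weighted_distillation_step(teacher, student, x, optimizer,
                               temperature=2.0):
    """One training step with per-example weights on the distillation loss.

    The confidence-based weight below is a placeholder assumption; the
    paper's weighting scheme may differ.
    """
    with torch.no_grad():
        t_probs = F.softmax(teacher(x) / temperature, dim=-1)
        # Assumed weight: teacher confidence (max predicted probability).
        weights = t_probs.max(dim=-1).values
    s_log_probs = F.log_softmax(student(x) / temperature, dim=-1)
    # Per-example KL, then a weighted average instead of a plain mean.
    per_example_kl = F.kl_div(s_log_probs, t_probs, reduction="none").sum(-1)
    loss = (weights * per_example_kl).sum() / weights.sum().clamp_min(1e-8)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```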

Robust Active Distillation

no code implementations • 3 Oct 2022 • Cenk Baykal, Khoa Trinh, Fotis Iliopoulos, Gaurav Menghani, Erik Vee

Distilling knowledge from a large teacher model to a lightweight one is a widely successful approach for generating compact, powerful models in the semi-supervised learning setting where a limited amount of labeled data is available.

Active Learning • Informativeness +1
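
As background for the active-learning angle the tags suggest, here is a sketch of spending a limited teacher-labeling budget on the most informative unlabeled points. Selecting by student predictive entropy is a generic active-learning heuristic assumed for illustration; it is not necessarily the paper's selection criterion.

```python
import torch
import torch.nn.functional as F

def select_for_teacher_labeling(student, unlabeled_x, budget):
    """Pick the `budget` unlabeled points the student is least sure about.

    Entropy-based selection is a standard active-learning heuristic used
    here as an assumption, not the paper's exact criterion.
    """
    student.eval()
    with torch.no_grad():
        probs = F.softmax(student(unlabeled_x), dim=-1)
        # Predictive entropy as an informativeness score.
        entropy = -(probs * probs.clamp_min(1e-12).log()).sum(-1)
    top = torch.topk(entropy, k=min(budget, unlabeled_x.shape[0])).indices
    return unlabeled_x[top]  # send these to the teacher for soft labels
```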
