Search Results for author: Hanpeng Liu

Found 4 papers, 1 paper with code

Knowledge Distillation via Token-level Relationship Graph

no code implementations · 20 Jun 2023 · Shuoxi Zhang, Hanpeng Liu, Kun He

To address the above limitations, we propose a novel method called Knowledge Distillation with Token-level Relationship Graph (TRG) that leverages token-wise relational knowledge to enhance the performance of knowledge distillation.

Tasks: Knowledge Distillation · Transfer Learning
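
No code is listed for this paper, so the following is a rough illustration only: a minimal PyTorch sketch of token-wise relational distillation, assuming the relationship graph is a pairwise cosine-similarity matrix over token embeddings (all names hypothetical; the paper's actual graph construction and loss may differ).

```python
import torch
import torch.nn.functional as F

def token_relation_graph(tokens: torch.Tensor) -> torch.Tensor:
    # tokens: (batch, num_tokens, dim) -> pairwise cosine-similarity
    # matrix of shape (batch, num_tokens, num_tokens).
    normed = F.normalize(tokens, dim=-1)
    return normed @ normed.transpose(-1, -2)

def trg_distillation_loss(student_tokens, teacher_tokens):
    # Match the student's token relationship graph to the teacher's;
    # the teacher graph is detached so no gradient flows into it.
    g_s = token_relation_graph(student_tokens)
    g_t = token_relation_graph(teacher_tokens).detach()
    return F.mse_loss(g_s, g_t)

# Toy usage: differing embedding widths are fine because only the
# token-by-token graphs are compared, never the raw features.
student = torch.randn(4, 16, 128, requires_grad=True)
teacher = torch.randn(4, 16, 384)
trg_distillation_loss(student, teacher).backward()
```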

Class-aware Information for Logit-based Knowledge Distillation

no code implementations · 27 Nov 2022 · Shuoxi Zhang, Hanpeng Liu, John E. Hopcroft, Kun He

Knowledge distillation aims to transfer knowledge to the student model by utilizing the predictions/features of the teacher model, and feature-based distillation has recently shown its superiority over logit-based distillation.

Tasks: Knowledge Distillation
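
For context, the logit-based distillation that this paper extends is the classic temperature-softened KL-divergence loss of Hinton et al.; a minimal PyTorch sketch is below. The class-aware information the paper adds on top is not reproduced here.

```python
import torch
import torch.nn.functional as F

def logit_kd_loss(student_logits, teacher_logits, T: float = 4.0):
    # KL divergence between temperature-softened distributions; the
    # T**2 factor keeps gradient magnitudes comparable across T.
    log_p_s = F.log_softmax(student_logits / T, dim=-1)
    p_t = F.softmax(teacher_logits / T, dim=-1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * (T ** 2)

student_logits = torch.randn(8, 100, requires_grad=True)  # 100 classes
teacher_logits = torch.randn(8, 100)
logit_kd_loss(student_logits, teacher_logits).backward()
```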

Feature Interaction Interpretability: A Case for Explaining Ad-Recommendation Systems via Neural Interaction Detection

1 code implementation · ICLR 2020 · Michael Tsang, Dehua Cheng, Hanpeng Liu, Xue Feng, Eric Zhou, Yan Liu

Recommendation is a prevalent application of machine learning that affects many users; therefore, it is important for recommender models to be accurate and interpretable.

Tasks: BIG-bench Machine Learning · Image Classification · +1
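
The linked implementation is the authoritative reference; purely as a simplified sketch of the Neural Interaction Detection idea this work builds on (Tsang et al., 2018), pairwise interaction strength can be scored from an MLP's first-layer weights: each hidden unit contributes the smaller of its two incoming weight magnitudes, scaled by that unit's influence on the output. The flat `unit_influence` vector below is a stand-in assumption for NID's path-based aggregate.

```python
import torch

def pairwise_interaction_strengths(w1, unit_influence):
    # w1: (hidden, features) first-layer weights.
    # unit_influence: (hidden,) nonnegative influence of each hidden
    # unit on the output (in NID, derived from later-layer weights).
    a = w1.abs()
    # min(|w_ji|, |w_jk|) for every feature pair (i, k), per unit j.
    mins = torch.minimum(a.unsqueeze(2), a.unsqueeze(1))  # (hidden, f, f)
    return torch.einsum("h,hik->ik", unit_influence, mins)

torch.manual_seed(0)
w1 = torch.randn(32, 6)     # 32 hidden units, 6 input features
influence = torch.rand(32)  # stand-in for path-based influence
scores = pairwise_interaction_strengths(w1, influence)
scores.fill_diagonal_(0.0)  # ignore self-pairs
i, k = divmod(scores.argmax().item(), scores.size(1))
print(f"strongest candidate interaction: features {i} and {k}")
```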

Neural Interaction Transparency (NIT): Disentangling Learned Interactions for Improved Interpretability

no code implementations · NeurIPS 2018 · Michael Tsang, Hanpeng Liu, Sanjay Purushotham, Pavankumar Murali, Yan Liu

Neural networks are known to model statistical interactions, but they entangle the interactions at intermediate hidden layers for shared representation learning.

Tasks: Additive Models · Representation Learning
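
No code is listed for this paper; the sketch below illustrates the disentangling idea under the simplest reading (class and parameter names hypothetical): route disjoint feature blocks through separate subnetworks and sum their outputs, so every learned interaction is confined to one interpretable block. NIT's mechanism for learning the feature-to-block assignment itself is not shown.

```python
import torch
import torch.nn as nn

class BlockedAdditiveNet(nn.Module):
    # Additive model over disjoint feature blocks: each block gets its
    # own MLP, so interactions cannot entangle across blocks.
    def __init__(self, feature_blocks, hidden=16):
        super().__init__()
        self.blocks = feature_blocks  # e.g. [[0, 1], [2], [3, 4]]
        self.nets = nn.ModuleList(
            nn.Sequential(nn.Linear(len(b), hidden), nn.ReLU(),
                          nn.Linear(hidden, 1))
            for b in feature_blocks
        )

    def forward(self, x):
        # Sum the per-block predictions (generalized additive structure).
        return sum(net(x[:, b]) for b, net in zip(self.blocks, self.nets))

model = BlockedAdditiveNet([[0, 1], [2], [3, 4]])
y = model(torch.randn(8, 5))  # -> shape (8, 1)
```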
