Knowledge Distillation via Token-level Relationship Graph. (arXiv:2306.12442v1 [cs.LG])


By: Shuoxi Zhang, Hanpeng Liu, Kun He. Posted: June 23, 2023

Knowledge distillation is a powerful technique for transferring knowledge
from a pre-trained teacher model to a student model. However, the true
potential of knowledge transfer has not been fully explored. Existing
approaches primarily focus on distilling individual instance knowledge or
instance-level relationships, overlooking the valuable knowledge embedded in
token-level relationships, which may be particularly affected by long-tail
effects. To address these limitations, we propose a novel method called
Knowledge Distillation with Token-level Relationship Graph (TRG) that leverages
the token-wise relational knowledge to enhance the performance of knowledge
distillation. By employing TRG, the student model can effectively emulate
higher-level semantic information from the teacher model, resulting in improved
distillation results. To further enhance the learning process, we introduce a
token-wise contextual loss, which encourages the student model to capture the
inner-instance semantic context of the teacher model.
We conduct experiments to evaluate the effectiveness of the proposed method
against several state-of-the-art approaches. Empirical results demonstrate the
superiority of TRG across various visual classification tasks, including those
involving imbalanced data. Our method consistently outperforms existing
baselines, establishing new state-of-the-art performance in knowledge
distillation.
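
A minimal sketch of the idea, assuming ViT-style token embeddings and a PyTorch setup: the functions below (token_relation_graph, trg_loss, contextual_loss) and the specific cosine-similarity and KL formulations are illustrative assumptions, not the authors' released implementation.

import torch
import torch.nn.functional as F

def token_relation_graph(tokens: torch.Tensor) -> torch.Tensor:
    # Build a token-level relationship graph as a cosine-similarity matrix.
    # tokens: (batch, num_tokens, dim) token embeddings from one model.
    tokens = F.normalize(tokens, dim=-1)
    return tokens @ tokens.transpose(-1, -2)  # (batch, num_tokens, num_tokens)

def trg_loss(student_tokens: torch.Tensor, teacher_tokens: torch.Tensor) -> torch.Tensor:
    # Align the student's token relationship graph with the teacher's.
    return F.mse_loss(token_relation_graph(student_tokens),
                      token_relation_graph(teacher_tokens))

def contextual_loss(student_tokens: torch.Tensor, teacher_tokens: torch.Tensor) -> torch.Tensor:
    # Encourage the student to match the teacher's inner-instance token context:
    # KL divergence between per-token similarity distributions (softmaxed graph rows).
    log_p_s = F.log_softmax(token_relation_graph(student_tokens), dim=-1)
    p_t = F.softmax(token_relation_graph(teacher_tokens), dim=-1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean")

# Toy usage: a linear projection aligns hypothetical teacher/student token dimensions.
student = torch.randn(2, 16, 128)             # hypothetical student patch tokens
teacher = torch.randn(2, 16, 256)             # hypothetical teacher patch tokens
teacher = torch.nn.Linear(256, 128)(teacher)
total = trg_loss(student, teacher) + contextual_loss(student, teacher)
print(total.item())

In practice, such relational terms would be added to the usual task and logit-distillation losses with weighting coefficients chosen on a validation set.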


DoctorMorDi

Moderator and Editor