'NKD and USKD' (ICCV 2023) and 'ViTKD' (CVPRW 2024)
[ECCV 2022] Official implementation of MixSKD: Self-Knowledge Distillation from Mixup for Image Recognition, plus PyTorch implementations of several self-knowledge distillation and data augmentation methods
The official implementation of the NeurIPS 2024 paper "Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation" (https://arxiv.org/abs/2412.08139)
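For context on what that paper's Wasserstein objective is being compared against: the classic distillation loss is a temperature-scaled KL divergence between teacher and student softmax outputs (Hinton et al., 2015). A minimal pure-Python sketch of that baseline (not the paper's method; function names here are illustrative) might look like:

```python
import math

def softmax(logits, T=1.0):
    # Numerically stable softmax over one logit vector, softened by temperature T.
    scaled = [z / T for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    s = sum(exps)
    return [e / s for e in exps]

def kd_kl_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients stay comparable across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return T * T * sum(pi * (math.log(pi) - math.log(qi)) for pi, qi in zip(p, q))
```

The loss is zero when student and teacher logits agree and positive otherwise; the Wasserstein-based objective in the paper above replaces this KL term with an optimal-transport distance.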
The code for "Efficient-PrototypicalNet with self-knowledge distillation for few-shot learning"
Semi-PKD: Semi-supervised Pseudoknowledge Distillation for saliency prediction