Pytorch implementation of various Knowledge Distillation (KD) methods.
Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
PyContinual (An Easy and Extendible Framework for Continual Learning)
An Extensible Continual Learning Framework Focused on Language Models (LMs)
Code and dataset for ACL2018 paper "Exploiting Document Knowledge for Aspect-level Sentiment Classification"
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
Code and pretrained models for paper: Data-Free Adversarial Distillation
[ECCV2022] Factorizing Knowledge in Neural Networks
[Paper][AAAI 2023] DUET: Cross-modal Semantic Grounding for Contrastive Zero-shot Learning
Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV'2020)
PyTorch implementation of (Hinton) Knowledge Distillation and a base class for simple implementation of other distillation methods (a minimal sketch of the Hinton loss appears after this list).
Code for ECML/PKDD 2020 Paper --- Continual Learning with Knowledge Transfer for Sentiment Classification
Code for NeurIPS 2020 Paper --- Continual Learning of a Mixed Sequence of Similar and Dissimilar Tasks
[NeurIPS'23] Source code of "Data-Centric Learning from Unlabeled Graphs with Diffusion Model": A data-centric transfer learning framework with diffusion model on graphs.
[arXiv 2024] PyTorch implementation of RRD: https://arxiv.org/abs/2407.12073
Implementation of NAACL 2024 main conference paper: Named Entity Recognition Under Domain Shift via Metric Learning for Life Science
[TPAMI'2024] Multi-sensor Learning Enables Information Transfer across Different Sensory Data and Augments Multi-modality Imaging
Functional Knowledge Transfer with Self-supervised Representation Learning (ICIP 2023)
Implementations of multiple methods for transferring knowledge between neural networks, with utilities to save, plot, and compare the results.
Uses pretrained deep learning models such as MobileNet to predict objects in an image.
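Several of the repositories above implement the classic (Hinton) knowledge distillation objective. For orientation, here is a minimal sketch of that loss in PyTorch; it is not taken from any particular repository, and the temperature `T` and weight `alpha` are illustrative hyperparameters.

```python
# Minimal sketch of the Hinton et al. knowledge distillation loss,
# assuming a trained teacher and a student trained on the same batch.
# T (temperature) and alpha (soft/hard weighting) are illustrative values.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft targets: KL divergence between temperature-scaled distributions,
    # scaled by T^2 as in the original formulation.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In practice the teacher's logits are computed under `torch.no_grad()`, and only the student's parameters are updated with this combined loss.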