Awesome Knowledge Distillation
Updated Oct 14, 2022
PyTorch implementation of various Knowledge Distillation (KD) methods.
Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
PyContinual (An Easy and Extendible Framework for Continual Learning)
Code and dataset for ACL2018 paper "Exploiting Document Knowledge for Aspect-level Sentiment Classification"
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
A repository dedicated to listing recent research advancements in the application of self-supervised learning to the medical image computing field.
Code and pretrained models for paper: Data-Free Adversarial Distillation
[ECCV2022] Factorizing Knowledge in Neural Networks
Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV'2020)
Code for NeurIPS 2020 Paper --- Continual Learning of a Mixed Sequence of Similar and Dissimilar Tasks
An Extensible Continual Learning Framework Focused on Language Models (LMs)
Code for ECML/PKDD 2020 Paper --- Continual Learning with Knowledge Transfer for Sentiment Classification
PyTorch implementation of (Hinton) Knowledge Distillation and a base class for simple implementation of other distillation methods.
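For orientation, the Hinton-style distillation loss that several of the repos above implement trains the student on the teacher's temperature-softened outputs blended with the usual hard-label loss. A minimal pure-Python sketch (the temperature T and weight alpha below are illustrative defaults, not values taken from any listed repository):

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, label, T=4.0, alpha=0.9):
    """Hinton knowledge distillation: alpha * soft-target KL + (1 - alpha) * hard CE.

    The KL term is scaled by T^2 so its gradient magnitude stays comparable
    to the hard-label term as the temperature changes.
    """
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # Soft-target term: KL(teacher_T || student_T), scaled by T^2.
    soft = sum(pt * math.log(pt / ps) for pt, ps in zip(p_teacher, p_student)) * T * T
    # Hard-label term: cross-entropy of the student with the true class.
    hard = -math.log(softmax(student_logits)[label])
    return alpha * soft + (1.0 - alpha) * hard
```

With identical student and teacher logits the KL term vanishes, so the loss reduces to the down-weighted hard-label cross-entropy.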
Adaptive Model-based Transfer Evolutionary Algorithm
The Weizenbaum Institut "Knowledge Tool" supports interdisciplinary debates by providing a workshop methodology based on predefined instructions and prefabricated workshop materials, with the aim of structuring and recording multi-perspective exploration and analysis.
Learning in Growing Robots: Knowledge Transfer from Tadpole to Frog Robot
Lifelong Naive Bayes
Implementations of multiple methods for transferring knowledge between neural networks, with utilities to save, plot, and compare the results.