Here are 76 public repositories matching the knowledge-distillation topic.
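For orientation: most of these repositories build on the classic soft-target formulation of knowledge distillation, in which a small student network is trained to match a large teacher's temperature-softened output distribution alongside the usual hard-label loss. A minimal sketch in PyTorch (function names and hyperparameters here are illustrative, not taken from any listed repo):

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Blend softened teacher targets with the standard hard-label loss.

    T and alpha are illustrative defaults, not values from any repo here.
    """
    # KL divergence between temperature-softened distributions; the T*T
    # factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

A typical recipe freezes the teacher, runs both networks on the same batch, and optimizes only the student with this blended loss.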
- Awesome Knowledge Distillation
- NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego (Python, updated Mar 27, 2020)
- Improving Convolutional Networks via Attention Transfer (ICLR 2017); the attention-transfer loss is sketched after this list (Jupyter Notebook, updated Jul 11, 2018)
- Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab (Python, updated Jul 30, 2020)
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility (Python, updated Jan 28, 2020)
- PaddleSlim is an open-source library for deep model compression and architecture search (Python, updated Aug 3, 2020)
- Knowledge distillation papers
- Data Efficient Model Compression (Python, updated May 8, 2020)
- A treasure chest for image classification powered by PaddlePaddle (Python, updated Aug 3, 2020)
- Revisiting Knowledge Distillation via Label Smoothing Regularization (CVPR 2020 oral); a plain label-smoothing loss is sketched after this list for reference (Python, updated Jul 9, 2020)
- Official PyTorch implementation of Relational Knowledge Distillation (CVPR 2019); its distance term is sketched after this list (Python, updated Dec 9, 2019)
- Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to be added) (Python, updated Nov 21, 2019)
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) (Python, updated Jun 23, 2020)
- Infrastructures™ for Machine Learning Training/Inference in Production (Python, updated Oct 3, 2019)
- An Acceleration System for Large-scale Outlier Detection (Anomaly Detection) (Python, updated Jun 15, 2020)
- A large-scale study of Knowledge Distillation (Python, updated Apr 19, 2020)
- MicroExpNet: An Extremely Small and Fast Model For Expression Recognition From Frontal Face Images (Python, updated Jan 16, 2020)
- Knowledge Distillation using TensorFlow (Python, updated Aug 12, 2019)
- A complete PyTorch pipeline for image classification: training, prediction, TTA, model ensembling, model deployment, CNN feature extraction, classification with SVM or random forests, and model distillation (Python, updated Jul 25, 2020)
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) (Python, updated Sep 9, 2019)
- PyTorch code for the CVPR 2020 paper "Collaborative Distillation for Ultra-Resolution Universal Style Transfer" (Python, updated Jul 26, 2020)
- Dynamic Hierarchical Mimicking Towards Consistent Optimization Objectives (CVPR 2020) (Python, updated Jul 16, 2020)
- The Easiest Knowledge Distillation Library for Lightweight Deep Learning (Python, updated Jun 27, 2020)
- Research code for the ACL 2020 paper "Distilling Knowledge Learned in BERT for Text Generation" (Python, updated Jun 25, 2020)
- Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019) (Python, updated Sep 9, 2019)
- Zero-Shot Knowledge Distillation in Deep Networks (ICML 2019) (Python, updated Jun 20, 2019)
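The attention-transfer entry above (ICLR 2017) trains the student to mimic the teacher's spatial attention maps rather than its logits. A minimal sketch of that loss, assuming paired student/teacher feature maps of matching spatial size (function names are illustrative, not the repo's API):

```python
import torch
import torch.nn.functional as F

def attention_map(feat):
    # (N, C, H, W) -> (N, H*W): channel-wise mean of squared activations,
    # flattened and L2-normalized per sample.
    am = feat.pow(2).mean(dim=1).flatten(1)
    return F.normalize(am, dim=1)

def at_loss(student_feat, teacher_feat):
    # L2 distance between normalized attention maps of paired layers;
    # assumes matching spatial size (upsample one of them otherwise).
    return (attention_map(student_feat) - attention_map(teacher_feat)).pow(2).mean()
```

This term is typically summed over several matched layer pairs and added to the student's usual classification loss.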
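The "Revisiting Knowledge Distillation via Label Smoothing Regularization" entry (CVPR 2020) argues that part of KD's benefit resembles label smoothing, i.e. distilling from a uniform "virtual teacher". For reference, a minimal label-smoothing loss (an illustrative sketch, not the paper's code):

```python
import torch
import torch.nn.functional as F

def label_smoothing_loss(logits, labels, eps=0.1):
    # Cross-entropy against a smoothed target: (1 - eps) on the true
    # class, eps spread uniformly over the remaining classes.
    n_classes = logits.size(1)
    log_probs = F.log_softmax(logits, dim=1)
    target = torch.full_like(log_probs, eps / (n_classes - 1))
    target.scatter_(1, labels.unsqueeze(1), 1.0 - eps)
    return -(target * log_probs).sum(dim=1).mean()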
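The Relational Knowledge Distillation entry (CVPR 2019) transfers relations between examples instead of per-example outputs; its distance variant matches normalized pairwise distances between batch embeddings. A hedged sketch of that term (illustrative names, not the official implementation):

```python
import torch
import torch.nn.functional as F

def normalized_pdist(emb):
    # Pairwise Euclidean distances within the batch, divided by their
    # mean so teacher and student live on comparable scales.
    d = torch.cdist(emb, emb)
    off_diag = ~torch.eye(emb.size(0), dtype=torch.bool, device=emb.device)
    return d / d[off_diag].mean()

def rkd_distance_loss(student_emb, teacher_emb):
    # Huber loss between the two normalized distance matrices.
    return F.smooth_l1_loss(normalized_pdist(student_emb),
                            normalized_pdist(teacher_emb))
```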