A treasure chest for visual classification and recognition powered by PaddlePaddle
Awesome Knowledge Distillation
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego
EasyNLP: A Comprehensive and Easy-to-use NLP Toolkit
Improving Convolutional Networks via Attention Transfer (ICLR 2017)
PyTorch implementation of various knowledge distillation (KD) methods.
A complete PyTorch image-classification codebase: training, prediction, TTA, model ensembling, model deployment, CNN feature extraction, classification with SVM or random forest, and model distillation.
OpenMMLab Model Compression Toolbox and Benchmark.
Intel® Neural Compressor (formerly known as Intel® Low Precision Optimization Tool) provides unified APIs for network-compression techniques, such as low-precision quantization, sparsity, pruning, and knowledge distillation, across different deep learning frameworks, to pursue optimal inference performance.
A coding-free framework built on PyTorch for reproducible deep learning studies.
This is a collection of our NAS and Vision Transformer work.
EasyTransfer is designed to make the development of transfer learning in NLP applications easier.
Collection of recent methods on (deep) neural network compression and acceleration.
knowledge distillation papers
Knowledge Distillation: CVPR2020 Oral, Revisiting Knowledge Distillation via Label Smoothing Regularization
The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679
A PyTorch knowledge distillation library for benchmarking and extending works in the domains of knowledge distillation, pruning, and quantization.
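Most of the KD libraries above build on the classic formulation from Hinton et al.: the student is trained to match the teacher's temperature-softened output distribution via a KL-divergence term. A minimal dependency-free sketch of that loss (the logit values and temperature below are illustrative, not taken from any of the listed repositories):

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature scaling: higher T yields a softer distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    """Distillation term: KL(teacher || student) over temperature-softened
    distributions, scaled by T^2 so gradients stay comparable across T."""
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl
```

In practice this term is combined with the ordinary cross-entropy on the ground-truth labels, weighted by a mixing coefficient.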