The Wayback Machine - https://web.archive.org/web/20230127225110/https://github.com/topics/knowledge-distillation

knowledge-distillation

Here are 313 public repositories matching this topic...

Intel® Neural Compressor (formerly known as Intel® Low Precision Optimization Tool) aims to provide unified APIs for network compression technologies, such as low-precision quantization, sparsity, pruning, and knowledge distillation, across different deep learning frameworks in pursuit of optimal inference performance.

  • Updated Jan 27, 2023
  • Python
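To make the topic concrete: the classic soft-label formulation of knowledge distillation (Hinton et al., 2015) trains a small student model to match a large teacher's temperature-softened output distribution. Below is a minimal, framework-agnostic sketch of that loss in plain Python; the function names and the example logits are illustrative, not taken from any of the listed repositories.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution,
    # exposing the teacher's relative confidence across wrong classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    # KL divergence between the teacher's and student's softened
    # distributions, scaled by T^2 so gradient magnitudes stay
    # comparable as the temperature changes (Hinton et al., 2015).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl

# Illustrative logits: the loss is zero when the student matches
# the teacher exactly, and positive otherwise.
matched = distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1])
mismatched = distillation_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0])
```

In practice this term is combined with the ordinary cross-entropy against the hard labels, weighted by a mixing coefficient; frameworks in this topic typically wrap exactly this recipe behind tensor-library APIs.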

A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 20 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. have been implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.

  • Updated Dec 29, 2022
  • Python
