A treasure chest for visual classification and recognition powered by PaddlePaddle
Pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab.
Awesome Knowledge Distillation
A flexible PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments.
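Most of the KD repositories in this list build on the classic logit-matching objective from Hinton et al. (2015): a temperature-softened KL term against the teacher's logits plus ordinary cross-entropy on the hard labels. A minimal PyTorch sketch, where the function name and the default `temperature`/`alpha` values are illustrative rather than taken from any repo above:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, temperature=4.0, alpha=0.9):
    """Hinton-style knowledge distillation loss (illustrative sketch)."""
    # Soft targets: KL(teacher || student) at temperature T, scaled by T^2
    # so the gradient magnitude stays comparable to the hard-label term.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard
```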
A complete PyTorch image classification codebase covering training, prediction, test-time augmentation (TTA), model ensembling, model deployment, CNN feature extraction, classification with SVM or random forest, and model distillation.
Improving Convolutional Networks via Attention Transfer (ICLR 2017)
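The attention-transfer method from that ICLR 2017 paper distills spatial attention maps rather than logits: each convolutional feature map is reduced to a channel-pooled, L2-normalized map, and the student is trained to match the teacher's maps. A rough sketch, assuming student and teacher features share spatial dimensions (both function names are illustrative):

```python
import torch
import torch.nn.functional as F

def attention_map(feat):
    """Spatial attention map of a (B, C, H, W) feature tensor:
    channel-wise mean of squared activations, flattened and L2-normalized."""
    return F.normalize(feat.pow(2).mean(1).flatten(1))

def at_loss(student_feat, teacher_feat):
    """Attention-transfer loss: squared L2 distance between normalized maps."""
    return (attention_map(student_feat) - attention_map(teacher_feat)).pow(2).mean()
```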
PyTorch implementation of various Knowledge Distillation (KD) methods.
EasyNLP: A Comprehensive and Easy-to-use NLP Toolkit
SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
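For context on what low-bit quantization looks like in practice, here is a stand-in example using plain PyTorch post-training dynamic quantization, not the toolkit's own API:

```python
import torch
import torch.nn as nn

# Illustrative stand-in (plain PyTorch, not the compression toolkit's API):
# Linear weights are converted to INT8 ahead of time; activations are
# quantized dynamically at inference and outputs stay in floating point.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)
print(quantized)  # Linear layers are replaced by dynamically quantized versions
```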
This is a collection of our NAS and Vision Transformer work.
OpenMMLab Model Compression Toolbox and Benchmark.
Efficient computing methods developed by Huawei Noah's Ark Lab
NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego
Collection of AWESOME vision-language models for vision tasks
EasyTransfer is designed to make the development of transfer learning in NLP applications easier.
"Effective Whole-body Pose Estimation with Two-stages Distillation" (ICCV 2023, CV4Metaverse Workshop)
Collection of recent methods on (deep) neural network compression and acceleration.
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distillation-Oriented Trainer https://openaccess.thecvf.com/content/ICCV2023/papers/Zhao_DOT_A_Distillation-Oriented_Trainer_ICCV_2023_paper.pdf
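Decoupled Knowledge Distillation splits the classic KD loss into a target-class term (TCKD) and a non-target-class term (NCKD) that can be weighted independently. A condensed sketch of the loss from the CVPR 2022 paper; the official repo's implementation differs in details such as masking and reduction, and the default `alpha`/`beta`/`T` values here are illustrative:

```python
import torch
import torch.nn.functional as F

def dkd_loss(logits_s, logits_t, target, alpha=1.0, beta=8.0, T=4.0):
    """Decoupled KD: target-class KD (TCKD) + non-target-class KD (NCKD)."""
    gt = F.one_hot(target, logits_s.size(1)).bool()

    # TCKD: KL over the binary (target vs. non-target) probabilities.
    p_s = F.softmax(logits_s / T, dim=1)
    p_t = F.softmax(logits_t / T, dim=1)
    b_s = torch.stack([p_s[gt], 1 - p_s[gt]], dim=1)  # (B, 2)
    b_t = torch.stack([p_t[gt], 1 - p_t[gt]], dim=1)
    tckd = F.kl_div(b_s.log(), b_t, reduction="batchmean")

    # NCKD: KL over the non-target classes only (target logit masked out).
    mask = gt.float() * -1e9
    nckd = F.kl_div(
        F.log_softmax(logits_s / T + mask, dim=1),
        F.softmax(logits_t / T + mask, dim=1),
        reduction="batchmean",
    )
    return (alpha * tckd + beta * nckd) * (T ** 2)
```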
An Extensible (General) Continual Learning Framework based on PyTorch - official codebase of Dark Experience for General Continual Learning
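Dark Experience Replay (DER) combines continual learning with distillation: alongside cross-entropy on the current task, the model self-distills against the logits it produced when replayed examples were first seen. A minimal sketch of one training step, assuming a hypothetical reservoir `buffer` with `sample()` and `add()` methods (both names are illustrative, not the codebase's API):

```python
import torch
import torch.nn.functional as F

def der_step(model, x, y, buffer, alpha=0.5):
    """One Dark Experience Replay (DER) training step (illustrative sketch)."""
    logits = model(x)
    loss = F.cross_entropy(logits, y)  # plain CE on the current batch

    if len(buffer) > 0:
        # Replay: match current outputs to the logits stored when the
        # examples were first seen ("dark knowledge" self-distillation).
        buf_x, buf_logits = buffer.sample()
        loss = loss + alpha * F.mse_loss(model(buf_x), buf_logits)

    buffer.add(x, logits.detach())  # reservoir update with detached logits
    return loss
```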