Use Hugging Face Transformers to practice knowledge distillation, quantization, ONNX, and ONNX Runtime (ORT)
Updated Feb 10, 2023 - Jupyter Notebook
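As a starting point for the quantization/ONNX/ORT side of that repo, here is a minimal sketch (not the repo's own code; it assumes the `distilbert-base-uncased` checkpoint, ONNX opset 14, and CPU-only inference) that exports a Transformers classifier to ONNX, applies dynamic INT8 quantization, and runs the result with ONNX Runtime:

```python
# Minimal sketch: Transformers -> ONNX -> dynamic INT8 quantization -> ORT.
# Assumptions: distilbert-base-uncased checkpoint, opset 14, CPU-only.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from onnxruntime.quantization import quantize_dynamic, QuantType
import onnxruntime as ort

model_id = "distilbert-base-uncased"  # assumption: any encoder checkpoint works similarly
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.config.return_dict = False  # return tuples so tracing/export is clean
model.eval()

# Trace a sample batch and export the graph to ONNX.
enc = tokenizer("Knowledge distillation in practice.", return_tensors="pt")
torch.onnx.export(
    model,
    (enc["input_ids"], enc["attention_mask"]),
    "model.onnx",
    input_names=["input_ids", "attention_mask"],
    output_names=["logits"],
    dynamic_axes={"input_ids": {0: "batch", 1: "seq"},
                  "attention_mask": {0: "batch", 1: "seq"}},
    opset_version=14,
)

# Dynamic quantization: weights stored as INT8, activations quantized at runtime.
quantize_dynamic("model.onnx", "model.int8.onnx", weight_type=QuantType.QInt8)

# Inference with ORT on the quantized graph.
session = ort.InferenceSession("model.int8.onnx", providers=["CPUExecutionProvider"])
logits = session.run(["logits"], {
    "input_ids": enc["input_ids"].numpy(),
    "attention_mask": enc["attention_mask"].numpy(),
})[0]
print(logits.shape)
```

Dynamic quantization needs no calibration data because activations are quantized on the fly; static quantization would trade that convenience for a calibration set and usually better latency.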
Deep Learning Head Pose Estimation using PyTorch
Transfer Learning for Neural Topic Models using Knowledge Distillation
Multi-class image classification using "Distilling Knowledge by Mimicking Features".
Vision Transformer for TensorFlow 2
TensorFlow-based framework for 3D U-Net with knowledge distillation
Repository for the implementation of "Knowledge Distillation for Multi-task Learning"
Implementation of several neural network compression techniques (knowledge distillation, pruning, quantization, factorization), in Haiku.
Official code for the paper "Domain Generalization for Crop Segmentation with Knowledge Distillation"
PyTorch Lightning knowledge distillation template for image classification tasks
Improving Question Answering Performance Using Knowledge Distillation and Active Learning
This repository includes some detailed proofs of "Bias Variance Decomposition for KL Divergence".
PyTorch Lightning framework for knowledge distillation experiments with CNNs
TF 2.x implementation of Knowledge Distillation (Distilling the Knowledge in a Neural Network, NIPS 2014 Deep Learning Workshop); a minimal sketch of the loss appears after this list.
Image Classification Training Framework for Network Distillation
Code for paper "Classification-based Dynamic Network for Efficient Super-Resolution"
[EMNLP 2022 (Long, Findings)] CERBERUS: Multi-head Student Model to distill knowledge in ensemble of teacher models
<WIP>
Enhanced image captioning on the ROCO multimodal dataset using step-by-step distillation
[IEEE TII] On-Device Saliency Prediction Based on Pseudoknowledge Distillation
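For reference, the soft-target objective behind several of the distillation repos above (notably the TF 2.x entry implementing Hinton et al.'s "Distilling the Knowledge in a Neural Network") combines a temperature-scaled KL term with the usual cross-entropy. A minimal sketch, written in PyTorch rather than TF and assuming logit-level outputs, temperature T=2.0, and mixing weight alpha=0.5 (both hyperparameters are illustrative choices, not values from any repo above):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft targets: KL divergence between temperature-softened distributions.
    # The T*T factor keeps soft-target gradients on the same scale as the
    # hard-label term (Hinton et al., 2015).
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

The T² factor compensates for the 1/T² scaling that temperature softening applies to the soft-target gradients, so the two terms stay on comparable scales as T varies.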