A place to evaluate public models (Python, updated May 23, 2024)
Awesome Knowledge Distillation
🤗 Optimum Intel: Accelerate inference with Intel optimization tools
The Biorefinery Simulation and Techno-Economic Analysis Modules; Life Cycle Assessment; Chemical Process Simulation Under Uncertainty
Raspberry Pi and Arduino/ESP32 powered smart still controller system. Designed around the Still Spirits T-500 column and boiler, but easily adapted to any other gas or electric still with a dephlegmator.
PaddleSlim is an open-source library for deep model compression and architecture search.
The official repo for [AAAI 2024] "SimDistill: Simulated Multi-modal Distillation for BEV 3D Object Detection"
irresponsible innovation. Try now at https://chat.dev/
InsightFace Keras implementation
Distibot (DISTIller roBOT) is a Python program for the Raspberry Pi (Raspbian) that controls the whole distillation process
[ICLR 2022] Code for Graph-less Neural Networks: Teaching Old MLPs New Tricks via Distillation (GLNN)
Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition"
NLLB-200 distilled 350M model for English-to-Korean translation
[AAAI 2024] MESED: A Multi-modal Entity Set Expansion Dataset with Fine-grained Semantic Classes and Hard Negative Entities
(Interspeech 2023 & ICASSP 2024) Official repository for ARMHuBERT and STaRHuBERT
Papers and Book to look at when starting AGI 📚
A list of papers, docs, and code about efficient AIGC. This repo aims to provide information for efficient AIGC research, covering both language and vision, and is continuously improving. PRs adding works (papers, repositories) the repo has missed are welcome.
DINOv1 implementation in PyTorch
Interpretable and efficient predictors using pre-trained language models. Scikit-learn compatible.
Python code to implement LLM4Teach, a policy distillation approach for teaching reinforcement learning agents with Large Language Models
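Many of the repositories above center on knowledge distillation, where a small student model is trained to match a larger teacher's softened output distribution. As a minimal sketch (assuming PyTorch; the function name and temperature value are illustrative, not taken from any repo above):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Soft-target distillation loss: KL divergence between
    temperature-softened teacher and student distributions."""
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    # Scaling by T*T keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)
```

In practice this term is usually combined with the ordinary cross-entropy loss on ground-truth labels, weighted by a mixing coefficient.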