Static and animated distillation phase diagrams for chemistry education
Updated Jan 14, 2021 · Jupyter Notebook
Vision Transformer for TensorFlow 2
A tutorial on how to prune the embedding layer of a language model and craft a suitable tokenizer
Chemical Engineering application: Distillation calculator for McCabe-Thiele and Ponchon-Savarit methods. https://apguilherme.github.io/Distillation/
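The McCabe-Thiele method steps off theoretical stages between the vapor-liquid equilibrium curve and an operating line. A minimal sketch of that stepping at total reflux (where the operating line is y = x), assuming a constant relative volatility `alpha` and illustrative distillate/bottoms specs not taken from the linked calculator:

```python
# Hedged sketch: McCabe-Thiele stage stepping at total reflux for a binary
# mixture with constant relative volatility. alpha, x_D, and x_B below are
# illustrative assumptions, not values from the linked Distillation calculator.

def equilibrium_x(y, alpha):
    """Invert y = alpha*x / (1 + (alpha - 1)*x) to get the liquid composition."""
    return y / (alpha - (alpha - 1.0) * y)

def stages_total_reflux(x_distillate, x_bottoms, alpha):
    """Step between the equilibrium curve and the y = x operating line."""
    stages = 0
    y = x_distillate
    while True:
        x = equilibrium_x(y, alpha)   # horizontal step to the equilibrium curve
        stages += 1
        if x <= x_bottoms:
            break
        y = x                         # vertical step back to the y = x line
    return stages

# Example: benzene-toluene-like volatility (alpha = 2.5), 95% distillate,
# 5% bottoms.
n = stages_total_reflux(0.95, 0.05, 2.5)
print(n)  # 7 theoretical stages for these illustrative specs
```

The stage count agrees with the Fenske equation for the same specs (N_min = ln[(0.95/0.05)²] / ln 2.5 ≈ 6.4, rounded up to 7).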
A template for use in creating Autodistill Target Model packages.
PyTorch implementation of various distillation approaches for continual learning of Diffusion Models.
Distillation and some other iterative methods for fastText.
This is a fork of the distilling-step-by-step repository with the aim of creating a task-specific LLM distillation framework for healthcare.
Code for our paper DistilALHuBERT: A Distilled Parameter Sharing Audio Representation Model
MATLAB program that models a binary flash distillation column by calculating vapor-liquid equilibrium with the Antoine equation. Determines liquid and vapor product flow rates, compositions, and temperatures from given feed conditions such as pressure, temperature, and composition, and plots a T-x-y diagram.
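The core of such a flash model is the Antoine equation, log10(Psat) = A − B/(C + T), combined with Raoult's law for an ideal binary mixture. A minimal Python sketch of one T-x-y point, assuming the commonly tabulated benzene/toluene Antoine coefficients (P in mmHg, T in °C); treat the numbers as illustrative rather than the repository's actual values:

```python
# Hedged sketch: one T-x-y point for an ideal binary mixture via the Antoine
# equation and Raoult's law. Coefficients are commonly tabulated
# benzene/toluene values (log10 form, P in mmHg, T in deg C) - illustrative.

def antoine_psat(T_c, A, B, C):
    """Saturation pressure (mmHg) from log10(Psat) = A - B / (C + T)."""
    return 10.0 ** (A - B / (C + T_c))

def binary_vle(T_c, P_total, coeffs_light, coeffs_heavy):
    """Raoult's-law equilibrium point: returns (x_light, y_light) at T, P."""
    p1 = antoine_psat(T_c, *coeffs_light)
    p2 = antoine_psat(T_c, *coeffs_heavy)
    x1 = (P_total - p2) / (p1 - p2)   # liquid comp. from P = x1*p1 + (1-x1)*p2
    y1 = x1 * p1 / P_total            # vapor comp. from y1*P = x1*p1
    return x1, y1

BENZENE = (6.90565, 1211.033, 220.790)
TOLUENE = (6.95464, 1344.800, 219.480)

x, y = binary_vle(90.0, 760.0, BENZENE, TOLUENE)
print(round(x, 3), round(y, 3))  # x ~ 0.57, y ~ 0.77 at 90 C and 1 atm
```

Sweeping the temperature between the two pure-component boiling points and collecting (x, y) pairs yields the bubble- and dew-point curves of the T-x-y diagram the description mentions.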
A list of papers, docs, and code about diffusion distillation. This repo collects various distillation methods for diffusion models. PRs adding works (papers, repositories) the repo has missed are welcome.
Computer Science 791-025: Real-Time AI & High-Performance Machine Learning
Efficient Inference techniques implemented in PyTorch for computer vision.
[ICCV 2019] A Comprehensive Overhaul of Feature Distillation
Implementation code of GKD: Semi-supervised Graph Knowledge Distillation for Graph-Independent Inference accepted by Medical Image Computing and Computer Assisted Interventions (MICCAI 2021)
Distillation of GANs with fairness constraints
Model Distillation for Unlabeled and Imbalanced Data for Amino-Acid-Strings
Effective Knowledge Distillation Generalization for Language Models