A personal knowledge 🧠 base used to distill knowledge into atomic documents 📄 using Logseq
A curated list for Efficient Large Language Models
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc., are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
[CVPR 2024] Official PyTorch Code for "PromptKD: Unsupervised Prompt Distillation for Vision-Language Models"
SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
An Extendible (General) Continual Learning Framework based on PyTorch - official codebase of Dark Experience for General Continual Learning
Gather research papers, corresponding code (if available), reading notes, and any other related materials about hot 🔥🔥🔥 fields in Computer Vision based on Deep Learning.
A treasure chest for visual classification and recognition powered by PaddlePaddle
Awesome Knowledge Distillation
A curated list of awesome papers on NLP, Computer Vision, Model Compression, XAI, Reinforcement Learning, Security, etc.
Deep Multimodal Guidance for Medical Image Classification: https://arxiv.org/pdf/2203.05683.pdf
Official PyTorch Code for "Dynamic Temperature Knowledge Distillation"
Full Wiki enables seamless access to Wikipedia content in multiple languages. It translates English Wikipedia, the most comprehensive knowledge base, into other languages, so users do not need to know the translated search term. This project is a proof of concept of how LLMs will tear down language barriers.
Distill knowledge from in-context learning into efficient LoRA adapters, enabling expert LLM performance with smaller context windows.
[AAAI 2023] Official PyTorch Code for "Curriculum Temperature for Knowledge Distillation"
AI book for everyone
Code for CVPR'24 Paper: Segment Any Event Streams via Weighted Adaptation of Pivotal Tokens
[CVPR 2024 Highlight] Logit Standardization in Knowledge Distillation
[CVPR 2024] Source code for "Diffusion-Based Adaptation for Classification of Unknown Degraded Images".
A beginner-friendly introduction to model compression
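Several of the repositories above revolve around temperature-based knowledge distillation (e.g. Dynamic Temperature and Curriculum Temperature KD). As a minimal, self-contained sketch of the underlying idea — the classic Hinton-style KD loss, where a temperature `T` softens both teacher and student distributions and the KL divergence is scaled by `T²` — the function names below are illustrative and not taken from any listed repository:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T produces a softer
    # (more uniform) distribution, exposing "dark knowledge"
    # in the teacher's non-target logits.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude as T varies.
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl
```

In practice this term is combined with the ordinary cross-entropy on hard labels; the "dynamic" and "curriculum" variants listed above differ mainly in how `T` is chosen or scheduled during training.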