
self-distillation

Here are 14 public repositories matching this topic...

This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.

  • Updated Apr 12, 2024
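The survey above organizes KD into knowledge elicitation and distillation algorithms; the common core of most such algorithms is a temperature-softened KL objective between teacher and student predictions. In self-distillation the "teacher" is simply an earlier snapshot (or another head) of the same model. A minimal sketch of that objective, using NumPy for illustration (function names and the `T**2` scaling follow the standard Hinton-style formulation, not any specific listed repository):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; higher T produces softer distributions."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 so gradients stay comparable across temperatures.
    For self-distillation, teacher_logits come from a frozen snapshot
    of the student itself."""
    p = softmax(teacher_logits, T)               # soft targets from the teacher
    log_q = np.log(softmax(student_logits, T))   # student log-probabilities
    kl = np.sum(p * (np.log(p) - log_q), axis=-1)
    return float(T * T * kl.mean())
```

When the student's logits match the teacher's exactly, the loss is zero; any divergence between the two distributions makes it positive, which is what drives the student toward the teacher's soft targets.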
