This repo contains a list of channels and sources for learning about LLMs.

Updated May 12, 2024
Unify Efficient Fine-Tuning of 100+ LLMs
[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond.
✨✨Latest Papers and Datasets on Multimodal Large Language Models, and Their Evaluation.
A summary of Prompt & LLM papers, open-source data & models, and AIGC applications.
[ICML2024] Official PyTorch implementation of DoRA: Weight-Decomposed Low-Rank Adaptation
🐳 Aurora is a Chinese-language MoE model. Built on Mixtral-8x7B, it activates the model's Chinese open-domain chat capability.
InternLM-XComposer2 is a groundbreaking vision-language large model (VLLM) excelling in free-form text-image composition and comprehension.
A one-stop data processing system to make data higher-quality, juicier, and more digestible for LLMs! 🍎 🍋 🌽 ➡️ ➡️ 🍸 🍹 🍷
Discourse chat data crawling with on-the-fly parsing, ready for (multimodal) LLM instruction fine-tuning. Data includes texts, images, and links.
Generative Representational Instruction Tuning
A collection of completed LLM projects; a good starting point for learning about LLMs.
Dataset collection and preprocessing framework for NLP extreme multitask learning
DataDreamer: Prompt. Generate Synthetic Data. Train & Align Models. 🤖💤
Lightweight demos for finetuning LLMs. Powered by 🤗 transformers and open-source datasets.
Video Foundation Models & Data for Multimodal Understanding
Video-LLaVA: Learning United Visual Representation by Alignment Before Projection
A multimodal model for language-guided socially compliant robot navigation.
DialogStudio: Towards Richest and Most Diverse Unified Dataset Collection and Instruction-Aware Models for Conversational AI
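Among the entries above, DoRA names a concrete technique: decomposing a pretrained weight into a magnitude and a direction, then fine-tuning the direction with a LoRA-style low-rank update. A minimal NumPy sketch of that merge step follows; the function name, shapes, and per-column normalization are illustrative assumptions, not the official implementation.

```python
import numpy as np

def dora_merge(W0, B, A, m):
    """Sketch of a DoRA-style weight merge (hypothetical helper).

    W0: (d, k) frozen pretrained weight
    B:  (d, r) low-rank adapter, initialized to zero as in LoRA
    A:  (r, k) low-rank adapter
    m:  (k,)   learned per-column magnitude vector
    Returns the merged weight: magnitude times unit direction.
    """
    V = W0 + B @ A                                         # low-rank direction update
    col_norms = np.linalg.norm(V, axis=0, keepdims=True)   # per-column norms
    return m * (V / col_norms)                             # rescale each column to m

# toy example: d=4, k=3, rank r=2
rng = np.random.default_rng(0)
W0 = rng.standard_normal((4, 3))
B = np.zeros((4, 2))                     # zero init: update starts as identity
A = rng.standard_normal((2, 3))
m = np.linalg.norm(W0, axis=0)           # magnitude initialized from W0's columns

W = dora_merge(W0, B, A, m)
# with B = 0 the direction is W0 itself, so the merged weight equals W0
assert np.allclose(W, W0)
```

Because B starts at zero and m starts at the column norms of W0, the merged weight initially reproduces W0 exactly; training then adjusts magnitude and direction separately.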