Chinese NLP solutions (large language models, data, models, training, inference)
Updated May 23, 2024 - Python
🔥 Korean GPT-2: KoGPT2 fine-tuning, cased. Trained on Korean song-lyrics data 🔥
Generative models, nano version, for fun. No SOTA here; nano first.
Coursework for IA024 - Deep Neural Networks for Natural Language Processing, FEEC-Unicamp, first semester of 2024
Evaluating Transformer fine-tuning on literary datasets, using GPT-2 model.
Various LMs/LLMs below 3B parameters (for now) trained using SFT (Supervised Fine Tuning) for several downstream tasks
Auto-generates tweets using a pre-trained GPT-2-based large language model (LLM), available offline.
A simple CLI chat mode framework for local GPT-2 Tensorflow models
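A CLI chat framework of this kind is essentially a read-generate-print loop. Below is a minimal sketch of such a loop; the `generate` function here is a hypothetical stand-in, where the real repo would run inference against a local GPT-2 TensorFlow checkpoint:

```python
def generate(prompt):
    """Stand-in for model inference; a real implementation would
    feed `prompt` to a local GPT-2 checkpoint and decode tokens."""
    return f"(model reply to: {prompt})"

def chat(read_line, write_line):
    """Minimal chat REPL: read a line, generate a reply, print it.
    Typing 'quit' (or end of input) exits the loop."""
    while True:
        user = read_line()
        if user is None or user.strip().lower() == "quit":
            break
        write_line(generate(user.strip()))
```

In an interactive session, `read_line` would be `input` and `write_line` would be `print`; injecting them as parameters keeps the loop testable.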
The script continuously listens for voice commands, processes them, and either executes the action mapped to a predefined command or, if the command is not recognized, generates a response with the GPT-2 model.
Generative Pretrained Model (GPT) in JAX. A step by step guide to train LLMs on large datasets from scratch
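Training a GPT from scratch, as such guides describe, ultimately minimizes the next-token cross-entropy (causal language-modeling loss). A toy illustration in pure Python, with hypothetical predicted distributions over a 3-token vocabulary:

```python
import math

def causal_lm_loss(probs, tokens):
    """Average negative log-likelihood of each next token under the
    model's predicted distribution at the preceding position."""
    nll = 0.0
    for t in range(1, len(tokens)):
        p = probs[t - 1][tokens[t]]   # P(token_t | tokens_<t)
        nll -= math.log(p)
    return nll / (len(tokens) - 1)

# Hypothetical model outputs: each row is a distribution over the vocab.
probs = [
    [0.1, 0.7, 0.2],   # after position 0, the model favors token 1
    [0.2, 0.2, 0.6],   # after position 1, the model favors token 2
]
tokens = [0, 1, 2]
loss = causal_lm_loss(probs, tokens)   # = -(log 0.7 + log 0.6) / 2
```

In a real JAX training loop this quantity is computed over batches of token IDs and differentiated with `jax.grad` to update the model parameters.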
Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models
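Adan augments Adam-style adaptive updates with an estimate of gradient differences (a Nesterov-momentum reformulation). The scalar sketch below follows the paper's published update equations but omits bias correction and restarts, and the hyperparameter values are illustrative assumptions, not the repo's defaults:

```python
import math

def adan_step(theta, g, g_prev, state, lr=0.1,
              b1=0.02, b2=0.08, b3=0.01, eps=1e-8, wd=0.0):
    """One simplified Adan update for a scalar parameter."""
    m, v, n = state
    diff = g - g_prev
    m = (1 - b1) * m + b1 * g            # EMA of gradients
    v = (1 - b2) * v + b2 * diff         # EMA of gradient differences
    u = g + (1 - b2) * diff              # Nesterov-style combined gradient
    n = (1 - b3) * n + b3 * u * u        # second moment of u
    eta = lr / (math.sqrt(n) + eps)
    theta = (theta - eta * (m + (1 - b2) * v)) / (1 + lr * wd)
    return theta, (m, v, n)

# Minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3).
x, state, g_prev = 0.0, (0.0, 0.0, 0.0), 0.0
for _ in range(2000):
    g = 2 * (x - 3)
    x, state = adan_step(x, g, g_prev, state)
    g_prev = g
```

The gradient-difference term is what distinguishes Adan from plain Adam: it lets the optimizer anticipate curvature changes without an extra forward pass.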
Fine-Tuning Text Pre-Trained Transformers to In-Context Learn Simple Function Classes.
A Python-based chatbot project built on the autogen and tinygrad foundation, utilizing advanced agents for dynamic conversations and function orchestration, enhancing and expanding traditional chatbot capabilities.
Fine-tuned GPT-2 to generate Indian recipes from a few listed ingredients.
A Streamlit App for running Text Analysis Models
Efficient protein de novo design pipeline with a GPT-based generator and a transfer-learning-based discriminator
MindSpore online courses: Step into LLM