A notebook that runs GPT-Neo with low VRAM (6 GB) and CUDA acceleration by loading it into GPU memory in smaller parts.
Updated May 25, 2021 - Jupyter Notebook
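The low-VRAM trick described above can be sketched roughly as follows. This is a minimal PyTorch illustration with a toy stack of layers standing in for GPT-Neo; the notebook's actual implementation may differ in details.

```python
import torch
import torch.nn as nn

# Toy stand-in for a large transformer: a stack of blocks.
model = nn.Sequential(*[nn.Linear(64, 64) for _ in range(8)])

device = "cuda" if torch.cuda.is_available() else "cpu"

# Move the model to the GPU one block at a time instead of all at
# once, so peak memory usage during the transfer stays low.
for block in model:
    block.to(device)

x = torch.randn(1, 64, device=device)
out = model(x)
print(out.shape)  # torch.Size([1, 64])
```

Moving modules individually means only one block's weights are ever in transit at a time, which is what keeps the approach workable on cards with 6 GB of VRAM.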
A basic UI for running GPT-Neo 2.7B on low VRAM (3 GB minimum).
[WIP] WisdumbAI: Generate thoughts/tweets using GPT-Neo.
Few-shot learning using EleutherAI's GPT-Neo, an open-source counterpart to GPT-3.
Colab notebooks to run a basic AI Dungeon clone using GPT-Neo-2.7B.
Evaluating the quality of Automatic Code Generation and Recommendation tools, e.g. GitHub Copilot
A repository to run GPT-J-6B on low-VRAM machines (4.2 GB VRAM minimum for a 2000-token context, 3.5 GB for a 1000-token context). Loading the model requires 12 GB of free RAM.
Tool to generate lorem ipsum-style Insights for Insights Explorer
Hebrew text generation models based on EleutherAI's GPT-Neo. Each was trained on a TPUv3-8 made available via the TPU Research Cloud program.
📝 Amazon product description generator using GPT-Neo for Texta.ai
This repository contains various experiments and prototypes for getting used to working with GPT-like models and being creative with them.
Fine-tuning the 6-billion-parameter GPT-J (and other models) with LoRA and 8-bit compression.
Text generator for Amazon ads. Uses Natural Language Generation (NLG) to auto-generate ad copy, fine-tuning pre-trained GPT-Neo models to improve on an RNN/LSTM baseline.
Auto-generate an entire paper from a prompt or abstract using NLP
Automate GPT-3 website login.
Create soft prompts for fairseq 13B dense, GPT-J-6B and GPT-Neo-2.7B for free in a Google Colab TPU instance
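The "soft prompts" mentioned in the last entry can be illustrated with a minimal sketch (shapes and names here are illustrative, not taken from the linked repo): a small block of trainable embedding vectors is prepended to the token embeddings, while the frozen model's own weights are left untouched.

```python
import torch
import torch.nn as nn

embed_dim, n_prompt_tokens = 32, 5

# The soft prompt is the only trainable parameter: a few "virtual
# token" embeddings learned by gradient descent.
soft_prompt = nn.Parameter(torch.randn(n_prompt_tokens, embed_dim) * 0.02)

# Embeddings of a tokenized input (batch of 1, 10 tokens), as a
# frozen model's embedding layer would produce them.
token_embeds = torch.randn(1, 10, embed_dim)

# Prepend the soft prompt along the sequence dimension before the
# combined sequence is fed through the frozen transformer.
prompted = torch.cat(
    [soft_prompt.unsqueeze(0).expand(token_embeds.size(0), -1, -1), token_embeds],
    dim=1,
)
print(prompted.shape)  # torch.Size([1, 15, 32])
```

Because only the prompt vectors are optimized, this fits in far less memory than full fine-tuning, which is why it is practical for large models like fairseq 13B or GPT-J-6B on free Colab TPU instances.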