
Voice Agent Use Local LLM and TTS

YouTube Demo Here! ⇩

mafuyu_Voice.mp4

Japanese Demo Here! ⇩

Final.mp4

Introduction

This is a sample project demonstrating how to use a local LLM and TTS in a voice agent. It uses rinna/japanese-gpt-neox-3.6b-instruction-sft as the LLM and Style-Bert-Vits-2 as the TTS.
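As a rough illustration, the snippet below shows how the rinna instruction-tuned model can be queried with Hugging Face Transformers (a minimal sketch, not the repo's llm_agent.py; the prompt format and generation settings are assumptions). The generated reply is the text that would then be handed to the TTS.

# Minimal sketch: query the base rinna model with transformers.
# The "ユーザー:"/"システム:" prompt format follows the rinna instruction-sft convention.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "rinna/japanese-gpt-neox-3.6b-instruction-sft"
tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.float16, device_map="auto"
)

prompt = "ユーザー: ねえ、名前教えてよ<NL>システム: "
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
reply = tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(reply.replace("<NL>", "\n"))  # pass this text to the TTS step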

Installation

1. Clone the repository

git clone https://github.com/teftef6220/Local_LLM_bot.git
cd Local_LLM_bot

2. Create a virtual environment

python -m venv venv
source venv/bin/activate
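
On Windows, activate the venv with:

venv\Scripts\activate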

3. Install requirements

Install PyTorch, torchvision, and torchaudio from the official website:

pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118

Install cuDNN from the official website.

Install the remaining requirements:

pip install -r requirements.txt
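
After installing, you can confirm that PyTorch sees your GPU (the cu118 wheels above require a CUDA-capable GPU and driver):

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"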

4. Fine-tune LLM

Fine-tune the LLM with your data.

A Colab notebook is provided for fine-tuning: this notebook.

You can use rinna/japanese-gpt-neox-3.6b-instruction-sft as the pretrained model.

5. Fine-tune TTS

Fine-tune the TTS with your data. You can use Style-Bert-Vits-2 as the pretrained model.

Open In Colab

6. Set up Voice Agent

Place your fine-tuned LLM directory in the llm_models directory:

llm_models
    |
    |---model_instance_dir
        |
        |---adapter_model.bin
        |---adapter_model.json

and place your fine-tuned TTS directory in the Voice_models directory:

Voice_models
    |
    |---model_name
        |
        |---model_name_e100_s2000.safetensors
        |---config.json
        |---style_vectors.npy
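
For reference, an adapter laid out as above can typically be attached to the base model with PEFT, roughly as in the sketch below (an assumption based on the adapter_model.bin layout; the actual loading is handled by llm_agent.py).

# Hedged sketch: attach a fine-tuned adapter from llm_models/ to the base rinna model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = "rinna/japanese-gpt-neox-3.6b-instruction-sft"
tokenizer = AutoTokenizer.from_pretrained(base, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(base, torch_dtype=torch.float16, device_map="auto")
model = PeftModel.from_pretrained(model, "llm_models/model_instance_dir")  # adapter dir from above
model.eval()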

7. Run Voice Agent

Set the configuration in all_config.py, then run:

python llm_agent.py

8. Bluesky bot

A simple Bluesky bot that uses the local LLM is also provided; see below for how to run it.

8.1. Use Bluesky bot

You can mention the bot and it will reply to your mention, for example (Japanese for "Hey, tell me your name"):

@latextex.bsky.social ねえ、名前教えてよ

Set up a .env file as below:

BS_USER_NAME = "your email address"
BS_PASSWORD = "your password"

Then run the bot with the following command:

python blue_sky_bot.py

The bot detects mentions and replies to them using the LLM.
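
The core Bluesky interaction (log in, then post) looks roughly like the sketch below, which assumes the atproto and python-dotenv packages; detecting mentions and building the LLM reply are handled in blue_sky_bot.py.

# Hedged sketch: log in to Bluesky with credentials from .env and send a post.
import os
from dotenv import load_dotenv
from atproto import Client

load_dotenv()
client = Client()
client.login(os.getenv("BS_USER_NAME"), os.getenv("BS_PASSWORD"))

# Replying to a mention additionally requires a reply reference to the mentioning post;
# only a plain post is shown here for illustration.
client.send_post(text="こんにちは！")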

Credits

License

This project is licensed under the GNU Affero General Public License v3.0 - see the LICENSE file for details.