
moce 🌺🤖

Simple, fully local retrieval-augmented generation powered by Ollama, Embedchain, and Chainlit.

[screenshot: the Chainlit chat UI]

Setup

Install Ollama

Download and install Ollama from https://ollama.com.

The embedchain config uses dolphin-mixtral by default, but you can swap in any other model available through Ollama.
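For reference, an embedchain-style config with an Ollama LLM provider might look like the sketch below. The exact file name, keys, and values in this repo are assumptions; check the config file shipped with moce before editing.

```yaml
llm:
  provider: ollama
  config:
    model: dolphin-mixtral:latest  # swap for any model you have pulled with `ollama pull`
    temperature: 0.1
    stream: true
```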

Clone the repository

git clone https://github.com/deadbits/moce.git
cd moce

Setup Python virtual environment

python3 -m venv venv
source venv/bin/activate

Install Python requirements

pip install -r requirements.txt

Set Hugging Face API token

This is required during the first run to download the embedding model.

export HUGGINGFACE_API_TOKEN="hf_..."

Start ChromaDB Docker

docker pull chromadb/chroma
docker run -d -p 8000:8000 chromadb/chroma
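Before starting the app, you can sanity-check that the ChromaDB container is reachable. The heartbeat endpoint below is part of Chroma's v1 HTTP API; the helper function name is just for illustration and not part of this repo.

```shell
# Prints "chroma up" if the heartbeat endpoint answers, "chroma down" otherwise.
check_chroma() {
  if curl -sf "http://localhost:8000/api/v1/heartbeat" >/dev/null; then
    echo "chroma up"
  else
    echo "chroma down"
  fi
}
check_chroma
```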

Run

chainlit run moce.py --port 8888

Chat Commands

command   action
/add      add a new document to the knowledge base
/kb       return the last 25 documents added to the knowledge base
/help     display this table
*         all other input is treated as chat

Add Data

You can start a conversation by asking a question or sharing a document with the /add command.

Add data to knowledge base

/add https://huggingface.co/blog/shivance/illustrated-llm-os

View KB

Document names added to your knowledge base are tracked in data/indexed.json. The /kb command will return the last 25 document names.
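If you want to inspect the index outside the chat UI, a small shell helper can print the same last-25 view. This sketch assumes data/indexed.json holds a flat JSON array of document names; if the repo stores richer objects, adjust the print line accordingly. The show_kb name is hypothetical.

```shell
# Print the most recent 25 entries of a JSON-array index file.
show_kb() {
  python3 -c '
import json, sys
with open(sys.argv[1]) as f:
    docs = json.load(f)
for name in docs[-25:]:
    print(name)
' "$1"
}

# Only query the real index if it exists.
[ -f data/indexed.json ] && show_kb data/indexed.json || true
```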
