This project is a basic example of integrating Chainlit and LangChain with the Mistral Large Language Model (LLM). Ensure that the `data/test.pdf` file contains the context you want the LLM to use. Additionally, you may want to customize the `PROMPT_TEMPLATE` to suit your needs. 🙃
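Since `PROMPT_TEMPLATE` is the main customization point, here is a minimal sketch of what such a template might look like. The template wording and the `{context}`/`{question}` placeholder names are assumptions for illustration, not the actual contents of `app.py`:

```python
# Hypothetical prompt template; adapt the wording and placeholders
# ({context}, {question}) to match the ones used in app.py.
PROMPT_TEMPLATE = """Answer the question using only the context below.
If the answer is not in the context, say you don't know.

Context:
{context}

Question: {question}
Answer:"""

# Example of filling the template before sending it to the model.
prompt = PROMPT_TEMPLATE.format(
    context="Chainlit is a framework for building LLM chat apps.",
    question="What is Chainlit?",
)
```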
To start the project, install the dependencies listed below.

Download the `mistral:instruct` model (if needed):

```shell
ollama pull mistral:instruct
```

Install the Python dependencies:

```shell
pip install -r requirements.txt
```

And finally, start the server:

```shell
chainlit run app.py -w -h
```
That's it! 🎉 Now visit http://localhost:8000 to play with it! 😏