A research bot powered by locally hosted LLMs, built with LangChain and Ollama.
- Set up a `.env` file with the following keys:
  - `SERPER_API_KEY`
  - `BROWSERLESS_API_KEY`
- Install Ollama and begin running your desired model:

  ```shell
  ollama run mistral
  ```
- Run `main.py` with Streamlit:

  ```shell
  streamlit run main.py
  ```
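The two API keys above are typically read from the `.env` file when the app starts. A minimal stdlib-only sketch of such a loader (the real project may well use `python-dotenv` instead; that, and the demo key values, are assumptions, not something this README specifies):

```python
import os
from pathlib import Path

def load_env(path: str = ".env") -> None:
    """Minimal .env loader: parses KEY=VALUE lines, skips blanks and # comments."""
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        # Note: real-world loaders often refuse to override variables
        # already set in the environment; this sketch just assigns.
        os.environ[key.strip()] = value.strip()

# Demo with a throwaway file (the real app would read its own .env):
Path("demo.env").write_text(
    "# API keys\n"
    "SERPER_API_KEY=your-serper-key\n"
    "BROWSERLESS_API_KEY=your-browserless-key\n"
)
load_env("demo.env")
print(os.environ["SERPER_API_KEY"])  # → your-serper-key
```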
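Once `ollama run mistral` is up, the model is reachable over Ollama's local HTTP API (default port 11434); in the app itself, LangChain's Ollama integration plays this role. A hedged sketch of building a generation request against that endpoint — payload construction only, no network call, and the prompt is purely illustrative:

```python
import json
from urllib.request import Request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_generate_request(model: str, prompt: str) -> Request:
    # "stream": False asks Ollama for one complete JSON response
    # instead of a stream of partial chunks.
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return Request(OLLAMA_URL, data=body, headers={"Content-Type": "application/json"})

req = build_generate_request("mistral", "Summarize this article in three bullet points.")
print(req.get_full_url())  # → http://localhost:11434/api/generate
# Sending it (requires the Ollama server to be running):
#   from urllib.request import urlopen
#   reply = json.loads(urlopen(req).read())["response"]
```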
Special Thanks & Resources