Guide to self-hosting AI models using Traefik on a home network, offering cost-effective and controlled alternatives to cloud-based services.
An extremely simple chat interface for Ollama models.
Odin Runes, a Java-based GPT client, frees you from vendor lock-in, letting you interact with your preferred GPT model directly from your favorite text editor. It also aids prompt engineering by extracting context from diverse sources using technologies such as OCR, improving productivity and reducing costs.
Your gateway to both Ollama & Apple MLX models.
A spring-break project for easier access to Ollama language models.
A single chat UI for Ollama.
Automated (unofficial) Docker Hub mirror of tagged images on open-webui's GHCR repo
A Docker Compose setup to run Ollama, Flowise, Langfuse, and Open WebUI.
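A stack like the Docker Compose entry above might be wired together roughly as follows. This is a hypothetical sketch, not the repo's actual file: the image tags, host ports, and environment variables are assumptions based on each project's public Docker images, and Langfuse additionally needs a Postgres database that is omitted here.

```yaml
# Hypothetical docker-compose.yml sketch; verify images and ports
# against each project's own documentation before use.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"        # Ollama's default API port
    volumes:
      - ollama:/root/.ollama # persist downloaded models
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama
  flowise:
    image: flowiseai/flowise
    ports:
      - "3001:3000"
  langfuse:
    image: langfuse/langfuse  # also requires a Postgres DB (not shown)
    ports:
      - "3002:3000"
volumes:
  ollama:
```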
A simple interface for interacting with LLMs via a local installation of Ollama
A small web application for chatting with local LLMs via the Ollama API.
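Several of the entries above talk to a local Ollama installation over its HTTP API. As a minimal sketch (not taken from any repo listed here), the snippet below builds a request against Ollama's `/api/generate` endpoint; the model name `llama3` and the default port 11434 are assumptions.

```python
import json
import urllib.request

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint.

    Assumes a local Ollama server on its default port (11434) and a model
    that has already been pulled -- both assumptions, not guarantees.
    """
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for a single JSON object, not a stream
    }).encode("utf-8")
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_generate_request("llama3", "Why is the sky blue?")
# Actually sending it requires a running Ollama server:
#   with urllib.request.urlopen(req) as resp:
#       print(json.loads(resp.read())["response"])
```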
Ollama Chat is a GUI for Ollama designed for macOS.
A minimalistic UI for Ollama models: this powerful React interface drastically improves the chatbot experience and works offline.
Desktop UI for Ollama made with PyQt.
A simple and easy web UI for Ollama.
Simple web UI for Ollama
Witsy: desktop AI assistant
Simpler than simple: run an LLM on your computer effortlessly with Ollama, no GPU required.
An Ollama client made with GTK4 and Adwaita