internet llm - access your ollama (or any other local llm) instance from across the internet
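The repo's own implementation isn't shown on this page; purely as a rough sketch of the general idea, a small Go reverse proxy like the one below can put a publicly reachable endpoint in front of a local Ollama server (assuming Ollama's default 127.0.0.1:11434 address; the :8080 listen port is a placeholder, and you would add TLS and authentication before exposing anything).

```go
// Minimal sketch: expose a local Ollama server through a publicly reachable host
// by reverse-proxying requests to it. Assumes Ollama listens on its default port.
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
)

func main() {
	// Local Ollama instance (default address).
	target, err := url.Parse("http://127.0.0.1:11434")
	if err != nil {
		log.Fatal(err)
	}
	proxy := httputil.NewSingleHostReverseProxy(target)

	// Placeholder listen address; add TLS/auth before exposing this for real.
	log.Println("forwarding :8080 -> 127.0.0.1:11434")
	log.Fatal(http.ListenAndServe(":8080", proxy))
}
```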
Extremely simple chat interface for ollama models.
ollama web_ui: a simple and easy-to-use web UI for Ollama.
A script for managing the Ollama server and its models. It can start, stop, and update the server and launch various models through an interactive menu, making it convenient to manage Ollama from the command line.
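The script itself isn't reproduced here; purely as an illustration of the same idea in Go, a tiny menu loop can shell out to the standard ollama subcommands (list, pull, run). The menu layout below is hypothetical, not the repo's actual interface, and it assumes the ollama binary is on PATH.

```go
// Hypothetical sketch: drive the ollama CLI from a minimal interactive menu.
package main

import (
	"bufio"
	"fmt"
	"os"
	"os/exec"
	"strings"
)

// run invokes the ollama CLI with the given arguments, wired to the terminal.
func run(args ...string) {
	cmd := exec.Command("ollama", args...)
	cmd.Stdout, cmd.Stderr, cmd.Stdin = os.Stdout, os.Stderr, os.Stdin
	if err := cmd.Run(); err != nil {
		fmt.Println("ollama:", err)
	}
}

func main() {
	in := bufio.NewScanner(os.Stdin)
	for {
		fmt.Print("[l]ist  [p]ull <model>  [r]un <model>  [q]uit > ")
		if !in.Scan() {
			return
		}
		fields := strings.Fields(in.Text())
		if len(fields) == 0 {
			continue
		}
		switch fields[0] {
		case "l":
			run("list")
		case "p":
			run(append([]string{"pull"}, fields[1:]...)...)
		case "r":
			run(append([]string{"run"}, fields[1:]...)...)
		case "q":
			return
		}
	}
}
```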
An HTTP server that proxies requests to Vertex AI while exposing Ollama's REST API interface.
A frontend interface for interacting with AI models, compatible with Ollama and OpenAI.
An HTTP proxy that exposes Ollama's REST API interface in front of Vertex AI, optionally forwarding requests for other models to a real Ollama instance. Written in Go.
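Neither proxy's source is shown on this page; the sketch below only illustrates what an Ollama-compatible endpoint looks like in Go. It accepts the non-streaming shape of Ollama's /api/generate request and returns the matching response shape, with the actual Vertex AI call stubbed out (upstreamGenerate is an assumed placeholder, not the real client).

```go
// Sketch of an Ollama-compatible proxy: speak Ollama's /api/generate shape on the
// front, forward the prompt to some upstream backend behind the scenes.
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

// Subset of Ollama's /api/generate request body.
type generateRequest struct {
	Model  string `json:"model"`
	Prompt string `json:"prompt"`
}

// Subset of Ollama's non-streaming /api/generate response body.
type generateResponse struct {
	Model    string `json:"model"`
	Response string `json:"response"`
	Done     bool   `json:"done"`
}

// upstreamGenerate is a placeholder for the real backend call (e.g. Vertex AI).
func upstreamGenerate(model, prompt string) (string, error) {
	return "stubbed completion for: " + prompt, nil
}

func main() {
	http.HandleFunc("/api/generate", func(w http.ResponseWriter, r *http.Request) {
		var req generateRequest
		if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
			http.Error(w, err.Error(), http.StatusBadRequest)
			return
		}
		text, err := upstreamGenerate(req.Model, req.Prompt)
		if err != nil {
			http.Error(w, err.Error(), http.StatusBadGateway)
			return
		}
		// Listening on Ollama's default port lets existing Ollama clients point here.
		_ = json.NewEncoder(w).Encode(generateResponse{Model: req.Model, Response: text, Done: true})
	})
	log.Fatal(http.ListenAndServe(":11434", nil))
}
```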
This repo contains code that uses the colabxterm and langchain-community packages to install Ollama on a free-tier Google Colab T4 instance, pull a model, and chat with it.
Run multiple open-source large language models concurrently, powered by Ollama.
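As a client-side sketch of that idea (not the repo's code), the Go program below sends requests to several models through Ollama's /api/generate endpoint in parallel goroutines, with "stream": false so each call returns a single JSON object. The model names are assumptions, and how many models actually run simultaneously still depends on the Ollama server's own configuration.

```go
// Sketch: query several Ollama models concurrently, assuming a local Ollama
// server on the default port and non-streaming /api/generate responses.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"sync"
)

func generate(model, prompt string) (string, error) {
	body, _ := json.Marshal(map[string]any{
		"model":  model,
		"prompt": prompt,
		"stream": false, // one JSON object instead of a stream
	})
	resp, err := http.Post("http://127.0.0.1:11434/api/generate", "application/json", bytes.NewReader(body))
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	var out struct {
		Response string `json:"response"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		return "", err
	}
	return out.Response, nil
}

func main() {
	models := []string{"llama3", "mistral", "gemma"} // placeholders: use whatever is pulled locally
	var wg sync.WaitGroup
	for _, m := range models {
		wg.Add(1)
		go func(m string) {
			defer wg.Done()
			text, err := generate(m, "Summarize the Go memory model in one sentence.")
			if err != nil {
				fmt.Println(m, "error:", err)
				return
			}
			fmt.Printf("%s: %s\n", m, text)
		}(m)
	}
	wg.Wait()
}
```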
Simpler than simple: with Ollama, run LLMs on your computer easily and without a GPU.
A simple-to-use Ollama autocompletion engine with exposed options and streaming functionality.
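The streaming side of such an engine can be illustrated with a short Go sketch (again, not the repo's code): Ollama's /api/generate streams newline-delimited JSON objects until one arrives with "done": true, so printing each "response" fragment as it arrives yields token-by-token completions. The model name and prompt below are placeholders.

```go
// Sketch: stream a completion from Ollama's /api/generate, printing each
// "response" fragment as it arrives until the final "done" object.
package main

import (
	"bufio"
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	body, _ := json.Marshal(map[string]any{
		"model":  "llama3", // placeholder: assumed to be pulled locally
		"prompt": "Complete this Go function: func fib(n int) int {",
		"stream": true,
	})
	resp, err := http.Post("http://127.0.0.1:11434/api/generate", "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Ollama streams one JSON object per line.
	scanner := bufio.NewScanner(resp.Body)
	for scanner.Scan() {
		var chunk struct {
			Response string `json:"response"`
			Done     bool   `json:"done"`
		}
		if err := json.Unmarshal(scanner.Bytes(), &chunk); err != nil {
			continue
		}
		fmt.Print(chunk.Response) // print tokens as they arrive
		if chunk.Done {
			break
		}
	}
	fmt.Println()
}
```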
Minimalistic UI for Ollama LLMs: a powerful React interface that drastically improves the chatbot experience and works offline.
A snappy, keyboard-centric terminal user interface for interacting with large language models. Chat with ChatGPT, Claude, Llama 3, Phi 3, Mistral, Gemma and more.