The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
Updated May 15, 2024 - TypeScript
Ollama client for Swift
HTTP API for Nano Bots: small, AI-powered bots that can be easily shared as a single file, designed to support multiple providers such as Cohere Command, Google Gemini, Maritaca AI MariTalk, Mistral AI, Ollama, OpenAI ChatGPT, and others, with support for calling tools (functions).
💬 Discord AI chatbot using Ollama with the new Ollama API
OllamaChat: A user-friendly GUI for interacting with the llama2 and llama2-uncensored models, hosted locally with Python and KivyMD. Requires Ollama installed on Windows. For more, visit Ollama on GitHub
Implements a simple REPL chat with a locally running instance of Ollama.
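A REPL chat against a local Ollama instance reduces to a loop that posts the running conversation to Ollama's /api/chat endpoint on the default port 11434. The following is an illustrative Python sketch, not the repository's actual code; the model name "llama2" is an assumption.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local port

def build_chat_payload(model, messages):
    """Build the JSON body expected by Ollama's /api/chat endpoint."""
    return {"model": model, "messages": messages, "stream": False}

def chat_once(model, messages):
    """Send the conversation so far and return the assistant's reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_chat_payload(model, messages)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]

if __name__ == "__main__":
    history = []  # full message history, so the model keeps context
    while True:
        user = input("you> ")
        if user in ("exit", "quit"):
            break
        history.append({"role": "user", "content": user})
        reply = chat_once("llama2", history)  # model name is an assumption
        history.append({"role": "assistant", "content": reply})
        print("ollama>", reply)
```

Keeping the whole history in the request is what gives the REPL multi-turn memory; Ollama's chat endpoint is stateless between calls.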
A package manager for Go
A simple, easy-to-use library for interacting with the Ollama API in Delphi
GPTAggregator is a Python-based application that provides a unified interface to interact with various large language models (LLMs) through their respective APIs. The project aims to simplify the process of working with different LLM providers.
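A unified interface over multiple LLM providers, as GPTAggregator describes, typically comes down to a small adapter layer: one common method signature, one adapter per vendor. A hedged Python sketch under that assumption (the class and method names here are hypothetical, not GPTAggregator's actual API):

```python
from abc import ABC, abstractmethod

class LLMProvider(ABC):
    """Common interface that every backend adapter implements."""

    @abstractmethod
    def complete(self, prompt: str) -> str: ...

class EchoProvider(LLMProvider):
    """Stand-in backend for demonstration; a real adapter would call
    a vendor API (Ollama, OpenAI, etc.) behind the same signature."""

    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"

class Aggregator:
    """Routes requests to a named provider, hiding vendor differences."""

    def __init__(self):
        self._providers = {}

    def register(self, name: str, provider: LLMProvider):
        self._providers[name] = provider

    def complete(self, name: str, prompt: str) -> str:
        return self._providers[name].complete(prompt)

agg = Aggregator()
agg.register("echo", EchoProvider())
print(agg.complete("echo", "hello"))  # prints "echo: hello"
```

Swapping providers then means registering a different adapter; calling code never changes.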
A command line tool for journaling daily accomplishments and summarizing them to create a bragging document.
An Ollama client made with GTK4 and Adwaita
A simple and easy-to-use web UI for Ollama
A module for integrating LLMs into any workflow