A module for integrating LLMs into any workflow
Updated Mar 21, 2024 - TypeScript
Experiments in data science with OpenAI and LLMs.
Acts like WebChatGPT, forwarding your searches to a locally running Ollama instance.
Creating test beds with the help of ChatGPT, the in-house LLM Ollama, and a static, diverse set of programs.
A REST API proxy to Vertex AI: an HTTP server that exposes Vertex AI through the REST interface of Ollama.
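Because these proxies mimic Ollama's REST interface, any Ollama client can talk to them unchanged. A minimal sketch of the documented `/api/generate` call, assuming a server on Ollama's default address (`http://localhost:11434`); the model name used here is only an example:

```typescript
// Shape of a non-streaming request to Ollama's /api/generate endpoint.
interface GenerateRequest {
  model: string;   // e.g. "llama2" (example model name)
  prompt: string;  // the text to complete
  stream: boolean; // false → one JSON object instead of a stream
}

// Build the JSON payload Ollama (or a compatible proxy) expects.
function buildGenerateRequest(model: string, prompt: string): GenerateRequest {
  return { model, prompt, stream: false };
}

// POST the request and return the completion text from the `response` field.
async function generate(baseUrl: string, req: GenerateRequest): Promise<string> {
  const res = await fetch(`${baseUrl}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(req),
  });
  const data = await res.json();
  return data.response;
}

// Usage (against a local Ollama or a proxy with the same interface):
// const text = await generate("http://localhost:11434",
//                             buildGenerateRequest("llama2", "Why is the sky blue?"));
```

Swapping `baseUrl` is all it takes to point an existing Ollama client at one of these Vertex AI proxies instead.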
A simple Python game built with Pygame, in which a red square (controlled by an AI) tries to escape from a black square (controlled by the player). The AI treats the player as an enemy and uses four commands (left, right, up, down) to flee.
Buni is a TypeScript-based client API for Ollama, designed to be simple yet flexible.
A Huginn agent for interacting with the Ollama API.
Analysis of LLM performance on CPU and GPU: execution time and energy usage.
A PDF-to-text converter for chatting with an AI (ChatGPT, Ollama).
OllamaChat: a user-friendly GUI for interacting with the llama2 and llama2-uncensored models, hosted locally with Python and KivyMD. Requires Ollama for Windows. For more, visit Ollama on GitHub.
A simple, easy Ollama web UI.
A small example of consuming the Ollama server API, intended as a Gist describing an Ollama client.
The open-source repository for the Mistral AI Discord bot.
A REST API proxy to Vertex AI: an HTTP server that exposes Vertex AI through the REST interface of Ollama. Written in Go.
A user interface for chatting with multiple LLMs. Currently in development.