localllm
Here are 21 public repositories matching this topic...
- Read your local files and answer your queries (Updated Feb 14, 2024 - Python)
- Local AI Open Orca For Dummies is a user-friendly guide to running Large Language Models locally. Simplify your AI journey with easy-to-follow instructions and minimal setup. Perfect for developers tired of complex processes! (Updated Mar 5, 2024 - Python)
- A generalized information-seeking agent system with Large Language Models (LLMs). (Updated Apr 19, 2024 - Python)
- KVQuant: Towards 10 Million Context Length LLM Inference with KV Cache Quantization (Updated Apr 19, 2024 - Python)
- ScrAIbe Assistant leverages Whisper for precise audio processing and local LLMs via Ollama for efficient summarization. It is well suited to tasks such as taking notes from team meetings or lectures, offering a secure environment where no data (text, audio, or otherwise) leaves your local machine. (Updated Apr 21, 2024 - Python)
- This project builds a local retrieval-augmented generation (RAG) pipeline from scratch, connects it to a local LLM, and deploys it as a chatbot via Gradio. (Updated Apr 25, 2024 - Jupyter Notebook)
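The RAG pattern that entry describes can be sketched in a few lines. The toy bag-of-words retriever below is an illustrative assumption, not the repository's actual implementation (a real pipeline would use vector embeddings and send the assembled prompt to a local LLM, e.g. via Ollama):

```python
# Minimal retrieval-augmented generation (RAG) sketch.
# Retrieval uses a toy bag-of-words cosine similarity so the example
# stays self-contained; swap in real embeddings for actual use.
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words term counts for a piece of text."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Return the k documents most similar to the query."""
    qv = vectorize(query)
    return sorted(docs, key=lambda d: cosine(qv, vectorize(d)), reverse=True)[:k]

def build_prompt(query, docs):
    """Assemble the retrieved context and question into an LLM prompt."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Ollama runs large language models locally.",
    "Gradio builds quick web UIs for machine learning demos.",
]
prompt = build_prompt("What runs language models locally?", docs)
print(prompt)
```

In a full pipeline this prompt would be passed to a locally hosted model, with the chat loop wrapped in a Gradio interface.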
- Local AI search assistant (web or CLI) for Ollama and llama.cpp. Lightweight and easy to run, providing a Perplexity-like experience. (Updated Apr 26, 2024 - Python)
- [ICML 2024] SqueezeLLM: Dense-and-Sparse Quantization (Updated May 2, 2024 - Python)
- MVP of an idea using multiple LLM models to simulate and play D&D (local LLMs via Ollama, plus together.ai API support). (Updated May 17, 2024 - Python)
- Use your locally running AI models to assist you in your web browsing. (Updated Jun 3, 2024 - TypeScript)
- Run GGUF LLM models in the latest version of TextGen-webui. (Updated Jun 3, 2024 - Jupyter Notebook)