LocalLLMChat

Single HTML page access to an OpenAI API compatible local LLM.

I have enjoyed running local LLMs in LM Studio, but I wanted a simple way to access a model running on one computer at home from other machines, without standing up some kind of server framework.

This is a single web page that uses JavaScript to talk to the OpenAI-compatible API. Just edit the file and change the address from "localhost" to whatever makes sense for your network.
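For illustration, the change amounts to something like the following. The variable name and port here are assumptions, not necessarily the identifiers used in the actual file (LM Studio's local server listens on port 1234 by default):

```js
// Hypothetical endpoint setting; the real variable name in the file may differ.
// Before: only reachable from the machine running the model.
const API_URL = "http://localhost:1234/v1/chat/completions";
// After: reachable from other machines on the LAN (use the host's real IP).
// const API_URL = "http://192.168.1.50:1234/v1/chat/completions";
```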

It streams replies from the API, so you don't have to wait for the entire answer before it starts rendering. Since it is meant to run serverless, it deliberately keeps native features to a minimum.
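As a minimal sketch of how streaming from an OpenAI-compatible endpoint typically works in browser JavaScript (the URL, model name, and function names below are assumptions for illustration, not the page's actual code):

```js
// Sketch: stream a chat completion and hand each token to a callback.
async function streamReply(messages, onToken, signal) {
  const response = await fetch("http://localhost:1234/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "local-model", messages, stream: true }),
    signal, // lets a Stop button abort the request
  });

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // OpenAI-style streams are server-sent events: lines of "data: {...}"
    const lines = buffer.split("\n");
    buffer = lines.pop(); // keep any partial line for the next chunk
    for (const line of lines) {
      const data = line.replace(/^data: /, "").trim();
      if (!data || data === "[DONE]") continue;
      const token = JSON.parse(data).choices[0].delta?.content;
      if (token) onToken(token); // render the token as it arrives
    }
  }
}
```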

Generate - Passes the prompt to the API endpoint and begins displaying the response.
Stop - Cancels the in-flight request and clears the context (one possible wiring is sketched after this list).
Clear Context - Clears the context and starts a new conversation.
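One common way to wire up these buttons in plain browser JavaScript is an AbortController. The sketch below is an assumption about the approach, reusing the hypothetical streamReply() above and a hypothetical output element; it is not a description of this file's exact code:

```js
// Hypothetical button wiring, assuming a <div id="output"> in the page.
const output = document.getElementById("output");
let controller = null;
let context = []; // the running conversation

function generate(prompt) {
  controller = new AbortController();
  context.push({ role: "user", content: prompt });
  let reply = "";
  streamReply(
    context,
    (token) => { reply += token; output.textContent += token; },
    controller.signal
  )
    .then(() => context.push({ role: "assistant", content: reply }))
    .catch((err) => { if (err.name !== "AbortError") throw err; });
}

function stop() {
  if (controller) controller.abort(); // cancel the in-flight request
  context = [];                       // and clear the context
}

function clearContext() {
  context = []; // start a fresh conversation
}
```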
