
A frontend interface for interacting with AI Models. Compatible with Ollama and OpenAI


# AI Chat (aichat)

A chat interface for OpenAI and Ollama models featuring chat streaming, local caching, and customisable model parameters.
OpenAI models require an OpenAI developer key, which allows you to pay per token.
Check out the demo here

## Features

- Code highlighting on input and response
- LLaVA model support (vision models)
- Easy model sharing with just a link
- Completely local: all your conversations are stored in your browser, not on a server
- Custom model settings
- PWA for lightweight installation on mobile and desktop

## Screenshots

Fullscreen demo (screenshot)

Settings demo (screenshot)

## To do

- Fix multiple root elements in template (`src/components/ChatMessage/ChatmessageChunk.vue`)
- Explore continue prompt (`src/components/ChatMessage/ChatMessage.vue`)

## Install the dependencies

```bash
yarn
# or
npm install
```

## Start the app in development mode (hot-code reloading, error reporting, etc.)

The service can be launched in dev mode and is accessible at http://localhost:9200/#/

```bash
quasar dev
```

In dev mode, the `HMR_PORT` environment variable can be set to allow hot module reloading (HMR) to work when the service sits behind a subdomain or reverse proxy:

```yaml
environment:
    - HMR_PORT=443
```
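For example, a minimal `docker-compose.yml` sketch (the service name, image details, and port mappings below are illustrative assumptions, not taken from this repository; only the `HMR_PORT` variable comes from the text above):

```yaml
# Hypothetical compose file for running the dev server behind an
# HTTPS reverse proxy. Service name and ports are assumptions.
services:
  aichat:
    build: .
    ports:
      - "9200:9200"          # dev server (http://localhost:9200/#/)
    environment:
      - HMR_PORT=443         # HMR websocket connects back via the proxy's HTTPS port
```

With this in place, the browser's HMR client connects on port 443 (through the proxy) instead of the internal dev-server port.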

## Lint the files

```bash
yarn lint
# or
npm run lint
```

## Format the files

```bash
yarn format
# or
npm run format
```

## Build the app for production

```bash
quasar build
```

## Customize the configuration

See Configuring `quasar.config.js`.