LLM Workflow Engine (LWE) OpenRouter Provider plugin

OpenRouter Provider plugin for LLM Workflow Engine

Access to OpenRouter models.

Installation

From packages

Install the latest version of this software directly from GitHub with pip:

pip install git+https://github.com/llm-workflow-engine/lwe-plugin-provider-openrouter

From source (recommended for development)

Clone the repository:

git clone https://github.com/llm-workflow-engine/lwe-plugin-provider-openrouter.git

Install the development package:

cd lwe-plugin-provider-openrouter
pip install -e .

Configuration

Add the following to config.yaml in your profile:

plugins:
  enabled:
    - provider_openrouter
    # Any other plugins you want enabled...
  # OPTIONAL -- By default the plugin loads all model data via an API call
  # on startup. This slows startup, and makes the CLI completion list for
  # selecting models very long.
  # You can instead provide a 'models' object here with the relevant data,
  # and it will be used instead of the API call.
  provider_openrouter:
    models:
      # 'id' parameter of the model as it appears in the API.
      # This is also listed on the model's summary page on the OpenRouter
      # website.
      "mistralai/mixtral-8x22b-instruct":
        # The only parameter, and it's required.
        max_tokens: 65536
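If you want to pre-populate the optional 'models' mapping rather than writing it by hand, the entries can be derived from the model data OpenRouter returns from its models API. The sketch below is illustrative, not part of the plugin: it uses a hard-coded sample shaped like that API response (fields trimmed to 'id' and 'context_length') and the helper name 'to_models_config' is hypothetical.

```python
import json

# Sample data shaped like an OpenRouter models API response, trimmed to the
# two fields needed here ('context_length' is the model's token window).
sample_response = json.loads("""
{
  "data": [
    {"id": "mistralai/mixtral-8x22b-instruct", "context_length": 65536},
    {"id": "openai/gpt-3.5-turbo", "context_length": 16385}
  ]
}
""")

def to_models_config(api_response):
    """Convert API model entries into the plugin's 'models' mapping,
    keyed by model id with the required 'max_tokens' parameter."""
    return {
        entry["id"]: {"max_tokens": entry["context_length"]}
        for entry in api_response["data"]
    }

models = to_models_config(sample_response)
print(models["mistralai/mixtral-8x22b-instruct"])  # {'max_tokens': 65536}
```

Dump the resulting dict as YAML under 'provider_openrouter: models:' in config.yaml to skip the startup API call.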

Usage

From a running LWE shell:

/provider openrouter
/model model_name openai/gpt-3.5-turbo
