openlit/helm

OpenTelemetry-native LLM Application Observability

Documentation | Quickstart | Python SDK


Introduction

OpenLIT is an OpenTelemetry-native GenAI and LLM application observability tool. It's designed to make integrating observability into GenAI projects as simple as adding a single line of code. Whether you're working with popular LLM libraries such as OpenAI and HuggingFace or leveraging vector databases like ChromaDB, OpenLIT monitors your applications seamlessly, providing critical insights to improve performance and reliability.

This project proudly follows the Semantic Conventions of the OpenTelemetry community, consistently updating to align with the latest standards in observability.

💿 Installation

To install the OpenLIT chart with the release name openlit:

helm repo add openlit https://openlit.github.io/helm/

helm repo update

helm install openlit openlit/openlit

🔧 Note: If the openlit StatefulSet pod appears in an error state after installing the OpenLIT Helm chart, it may be because ClickHouse is still starting up. Allow time for ClickHouse to fully initialize, after which the openlit pod should become healthy. If issues persist, restarting the openlit pod should resolve them.
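To watch the startup or force a restart, something along these lines should work; note that the exact pod name (shown here as openlit-0) depends on the release name and is an assumption, so check the output of kubectl get pods first:

```shell
# Watch the pods until both ClickHouse and OpenLIT report Ready
kubectl get pods -w

# If the openlit pod stays in an error state after ClickHouse is up,
# delete it so the StatefulSet controller recreates it
kubectl delete pod openlit-0
```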

🚀 Getting Started After Installation

After the OpenLIT chart is successfully deployed to your Kubernetes cluster, you'll need to generate an API key that the OpenLIT SDKs can use to authenticate requests to the OpenLIT platform.

⚡️ Instrument your Application with OpenLIT

Select the SDK that matches your application's programming language and integrate LLM monitoring with just a single line of code.

Install the openlit Python SDK using pip:

pip install openlit

Add the following two lines to your application code:

import openlit

openlit.init()
Example usage for monitoring OpenAI:
from openai import OpenAI
import openlit

openlit.init()

client = OpenAI(
    api_key="YOUR_OPENAI_KEY"
)

chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "What is LLM Observability?",
        }
    ],
    model="gpt-3.5-turbo",
)

print(chat_completion.choices[0].message.content)

Refer to the openlit Python SDK repository for more advanced configurations and use cases.

Visualize and Analyze

Once you have the OpenLIT SDK set up, you can instantly get insights into how your LLM applications are performing. Just head over to the OpenLIT UI at http://127.0.0.1:3000 in your browser to start exploring.
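If the UI is not directly reachable from your machine (for example, with the default ClusterIP service type), a kubectl port-forward along these lines can expose it locally; the service name openlit and port 3000 are assumptions based on the default release name and the service port used elsewhere in this document:

```shell
# Forward local port 3000 to the OpenLIT service inside the cluster
kubectl port-forward svc/openlit 3000:3000
```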

With OpenLIT, you get a simple, powerful view into important info like how much you’re spending on LLMs, which parts of your app are using them the most, and how well they’re performing. Find out which LLM models are favorites among your applications, and dive deep into performance details to make smart decisions. This setup is perfect for optimizing your app performance and keeping an eye on costs.

Configuring OpenLIT

You can adjust the OpenLIT configuration by specifying each parameter using the --set key=value argument to helm install. For example:

helm install openlit \
  --set openlit.service.type=NodePort \
  --set openlit.service.port=3000 \
  openlit/openlit

Alternatively, you can provide a YAML file that specifies the values for the required parameters while installing the chart. For example:

helm install openlit -f values.yaml openlit/openlit
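As an illustration, a minimal values.yaml mirroring the --set flags from the previous example might look like the following; the openlit.service keys are taken from that example, and the chart's own default values.yaml is the authoritative reference for the full set of options:

```yaml
# values.yaml (sketch) — expose the OpenLIT service via NodePort
openlit:
  service:
    type: NodePort
    port: 3000
```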

Upgrading the Chart

To upgrade the openlit deployment:

helm upgrade openlit openlit/openlit

Uninstalling the Chart

To uninstall/delete the openlit deployment:

helm delete openlit

If you've made any changes via values.yaml, remember to pass the updated file with the -f flag when upgrading.
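For example, an upgrade that reapplies your customized values could look like this:

```shell
helm upgrade openlit -f values.yaml openlit/openlit
```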

Contributing

We welcome contributions to the OpenLIT project. Please refer to CONTRIBUTING for detailed guidelines on how you can participate.

License

OpenLIT is available under the Apache-2.0 license.

Support

For support, issues, or feature requests, open an issue in the GitHub issue tracker of the OpenLIT repository and add the Helm label.