
[Enhancement]: add integration with litellm for observability #1603

Merged

Conversation

@aybruhm (Member) commented May 2, 2024

Description

This PR adds a LiteLLM integration for observability.

Related Issue

Closes #1582

Additional Information

An example demonstrating the LiteLLM integration for observability, titled "litellm_integration", is included in this pull request.
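
For reference, a minimal sketch of how such a handler plugs into LiteLLM. The `AgentaLogger` name and the printed fields are hypothetical stand-ins for the handler shipped in the example; only the `CustomLogger` hooks and `litellm.callbacks` registration are LiteLLM's actual API:

```python
import litellm
from litellm.integrations.custom_logger import CustomLogger


class AgentaLogger(CustomLogger):
    """Hypothetical stand-in for the handler in the litellm_integration example."""

    def log_success_event(self, kwargs, response_obj, start_time, end_time):
        # Called once the LLM provider has responded; a real handler
        # would record the span (inputs, outputs, latency, cost) here.
        print("model:", kwargs.get("model"))
        print("latency:", end_time - start_time)


# Register the handler so every litellm call is traced.
litellm.callbacks = [AgentaLogger()]

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello!"}],
)
```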

@dosubot bot added the size:L label (This PR changes 100-499 lines, ignoring generated files) on May 2, 2024

@mmabrouk (Member) commented:

Closes #1569

@aybruhm (Member, Author) commented Jun 5, 2024

> The input for this span is totally wrong. Let's look at how others process the data from litellm and copy it (a note here: everyone seems to be using the response_obj while we are using kwargs, why?)

They're not. Check out this and this. The kwargs in log_pre_api_call contain the prepared data for the LLM provider. We only have access to response_obj after receiving a response from the LLM provider.
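
To illustrate the timing, a minimal sketch assuming LiteLLM's public CustomLogger hooks (the SpanLogger name is hypothetical):

```python
from litellm.integrations.custom_logger import CustomLogger


class SpanLogger(CustomLogger):
    """Hypothetical handler, for illustration only."""

    def log_pre_api_call(self, model, messages, kwargs):
        # Fires before the request is sent: kwargs holds the data
        # prepared for the LLM provider; no response exists yet.
        print("span input:", {"model": model, "messages": messages})

    def log_success_event(self, kwargs, response_obj, start_time, end_time):
        # Fires after the provider responds: response_obj is available now.
        print("span output:", response_obj.choices[0].message.content)
```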

re: @mmabrouk

@mmabrouk (Member) commented Jun 5, 2024

> > The input for this span is totally wrong. Let's look at how others process the data from litellm and copy it (a note here: everyone seems to be using the response_obj while we are using kwargs, why?)
>
> They're not. Check out this and this. The kwargs in log_pre_api_call contain the prepared data for the LLM provider. We only have access to response_obj after receiving a response from the LLM provider.
>
> re: @mmabrouk

You are right, sorry for missing that. However, the inputs for the span in the screenshot were wrong. It could be that the format we are logging them in is wrong (related to the issue we discussed this morning).

@aybruhm (Member, Author) commented Jun 5, 2024

> > > The input for this span is totally wrong. Let's look at how others process the data from litellm and copy it (a note here: everyone seems to be using the response_obj while we are using kwargs, why?)
> >
> > They're not. Check out this and this. The kwargs in log_pre_api_call contain the prepared data for the LLM provider. We only have access to response_obj after receiving a response from the LLM provider.
> >
> > re: @mmabrouk
>
> You are right, sorry for missing that. However, the inputs for the span in the screenshot were wrong. It could be that the format we are logging them in is wrong (related to the issue we discussed this morning).

Yup, correct. This has been resolved.

@dosubot bot added the lgtm label (This PR has been approved by a maintainer) on Jun 6, 2024
@mmabrouk merged commit dac87e0 into main on Jun 6, 2024
9 checks passed
@mmabrouk deleted the 1582-age-159-add-integration-with-litellm-for-observability branch on Jun 6, 2024 at 18:59
@mmabrouk (Member) commented Jun 6, 2024

Thanks @aybruhm! Great work!

Labels: CLI, enhancement (New feature or request), lgtm (This PR has been approved by a maintainer), observability, SDK, size:L (This PR changes 100-499 lines, ignoring generated files)