
custom_llm_provider is not working with acompletion, but is working with completion #3481

Closed

Conversation


@mrT23 mrT23 commented May 6, 2024

Addresses the bug described here:
#3480

I am not sure I understand the logic there, or why 'completion' and 'acompletion' treat the custom_llm_provider parameter differently, but this change fixes this specific issue.
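
(For context, a minimal sketch of the call pattern this addresses, assuming an OpenAI-compatible vLLM endpoint; the endpoint URL and model name below are placeholders, not details taken from #3480.)

import asyncio
import litellm

messages = [{"role": "user", "content": "Hello"}]

# Synchronous call: custom_llm_provider is applied and the request is routed
# to the OpenAI-compatible endpoint as expected.
response = litellm.completion(
    model="my-hosted-model",                # placeholder model name
    messages=messages,
    custom_llm_provider="openai",           # treat the endpoint as OpenAI-compatible
    api_base="http://localhost:8000/v1",    # placeholder vLLM endpoint
)

# Async call with the same arguments: the scenario reported in #3480, where
# custom_llm_provider was not honored in the same way.
async def main():
    return await litellm.acompletion(
        model="my-hosted-model",
        messages=messages,
        custom_llm_provider="openai",
        api_base="http://localhost:8000/v1",
    )

asyncio.run(main())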

Contributor

@ishaan-jaff ishaan-jaff left a comment


Hi @mrT23, thanks for the PR - could you please add a test for this scenario?

Author

mrT23 commented May 6, 2024

> Hi @mrT23, thanks for the PR - could you please add a test for this scenario?

I am not sure I have the technical understanding of litellm to do that.
Setting up a real vLLM server for testing is hard, and mocking can also be complicated if you don't know every last detail.

@ishaan-jaff
Contributor

> I am not sure I have the technical understanding of litellm to do that.

I believe editing one of our existing tests would work:

response = await litellm.acompletion(model="claude-2.1", messages=messages)

If you pass custom_llm_provider="anthropic" there ^ it would catch this scenario, right @mrT23?
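
(For illustration, a rough sketch of what that edit might look like as an async test; the test name is hypothetical, and it assumes pytest-asyncio and an Anthropic API key are available in the test environment.)

import pytest
import litellm


@pytest.mark.asyncio
async def test_acompletion_with_explicit_custom_llm_provider():
    # Hypothetical test: passing custom_llm_provider explicitly should not
    # break async provider resolution (the scenario from #3480).
    messages = [{"role": "user", "content": "Hey, how's it going?"}]
    response = await litellm.acompletion(
        model="claude-2.1",
        messages=messages,
        custom_llm_provider="anthropic",
    )
    assert response.choices[0].message.content is not None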
