When running LiteLLM in Docker, Open WebUI won't connect because my base URL is not localhost. Setting LITELLM_PROXY_HOST doesn't work.
Culprit:
https://github.com/open-webui/open-webui/blame/90503be2edef1a1f7ce2074286b6316d5cb8868a/backend/apps/litellm/main.py#L224
https://github.com/open-webui/open-webui/blame/90503be2edef1a1f7ce2074286b6316d5cb8868a/backend/apps/litellm/main.py#L332
Possible solution:
`f"{LITELLM_PROXY_HOST}:{LITELLM_PROXY_PORT}/v1"`
Replies: 1 comment

- This is for setting the host of the internal LiteLLM running inside WebUI, not for connecting to an external LiteLLM. Just add it as an "OpenAI" connection.
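To illustrate the reply's point: an external LiteLLM proxy exposes an OpenAI-compatible API, so anything that speaks the OpenAI protocol, including Open WebUI's "OpenAI" connection type, can talk to it directly. A quick sanity check, assuming the proxy is reachable at `http://litellm:4000` (the host name and port are placeholders) and no master key is configured:

```python
# Hypothetical connectivity check against an external LiteLLM proxy.
# "http://litellm:4000" stands in for wherever your proxy actually runs;
# LiteLLM accepts any API key unless a master key is set.
from openai import OpenAI

client = OpenAI(base_url="http://litellm:4000/v1", api_key="sk-placeholder")

# List the models the proxy serves. If this succeeds, Open WebUI can reach
# the same endpoint when it is added as an "OpenAI" connection.
for model in client.models.list():
    print(model.id)
```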