Replies: 3 comments 3 replies
-
If I had to make a guess, it seems like there was an issue parsing the config.yaml file. Could you share it with us?
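If it helps to narrow this down, here is a quick way to check whether a config.yaml file parses as valid YAML before starting the container (a minimal sketch; it assumes Python 3 with PyYAML installed and a `config.yaml` in the current directory — adjust the path to wherever your file actually lives):

```shell
# Try to parse config.yaml; prints "OK" on success.
# On a syntax error, PyYAML reports the offending line and column.
python3 -c 'import yaml, sys; yaml.safe_load(open(sys.argv[1])); print("OK")' config.yaml
```

If this fails, the error message usually points at the exact line that open-webui is tripping over.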
-
A fresh new install encounters the same error.
-
Bug Report
Description
Bug Summary:
After upgrading to v0.1.123 using
docker run --rm --volume /var/run/docker.sock:/var/run/docker.sock containrrr/watchtower --run-once open-webui
I could no longer use LiteLLM.
Steps to Reproduce:
docker run --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 -e http_proxy=http://172.24.235.74:7890 -e https_proxy=http://172.24.235.74:7890 -e no_proxy=127.0.0.1,localhost --name open-webui --restart always ghcr.io/open-webui/open-webui:main
Environment
Reproduction Details
Confirmation:
Logs and Screenshots
Browser Console Logs:
[Include relevant browser console logs, if applicable]
Docker Container Logs:
[Include relevant Docker container logs, if applicable]
Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]
Installation Method
[Describe the method you used to install the project, e.g., manual installation, Docker, package manager, etc.]
Additional Information
[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]
Note
If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!