Bad service name resolution importing model in k8s #19
Hi @angelocorreia27, am I correct in assuming that you used the built-in Ollama config with ollama.enabled set to true?
Thank you for taking the time to analyze the issue. I am indeed using the built-in Ollama config with ollama.enabled = true. However, I encountered a problem when deploying via ArgoCD. ArgoCD uses the metadata.name field from the manifest to create the service name. Since your manifest expects the service name for Ollama to be open-webui-ollama, this naming convention causes a conflict in my ArgoCD deployment setup. Here is my current values.yaml configuration:

To work around this issue, I deployed Ollama and the OpenWebUI separately. After deploying them independently, I defined the Ollama endpoint within the WebUI configuration. Thank you.
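For reference, the split setup described above could be sketched roughly like this. The key names are assumptions based on common conventions in the open-webui chart, not copied from the reporter's actual values.yaml, and the exact key for the Ollama endpoint may differ between chart versions:

```yaml
# values.yaml sketch for a standalone Open WebUI release (assumed key names).
ollama:
  enabled: false   # skip the bundled Ollama; it is deployed separately

# Point the WebUI at the independently deployed Ollama service
# ('ollamaUrls' and the service name here are illustrative assumptions).
ollamaUrls:
  - http://ollama.xxxx.svc.cluster.local:11434
```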
Thanks for the extra context. I think in the case of your ArgoCD deployment, you'd want to update your release/application name so that the generated Ollama service name matches what the chart expects. In the case of using a separate Ollama backend (which is the setup I use as well), you can also opt to update the Ollama endpoint in the WebUI configuration instead.
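To make the first suggestion concrete: Argo CD derives the Helm release name from the Application's metadata.name, which in turn feeds the chart's fullname helpers. A hedged sketch, with the repo URL and field values as assumptions:

```yaml
# Argo CD Application sketch (values are illustrative assumptions).
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: open-webui        # release name -> Ollama service 'open-webui-ollama'
  namespace: argocd
spec:
  project: default
  destination:
    namespace: xxxx
    server: https://kubernetes.default.svc
  source:
    chart: open-webui
    repoURL: https://helm.openwebui.com   # assumed chart repo
    targetRevision: "x.y.z"               # pin a real chart version
```

Naming the Application anything other than open-webui would (under this assumption) produce a differently named Ollama service, reproducing the resolution failure below.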
Sure. Thank you.
I found that this was a legitimate issue that resulted from one of the helper values in our chart trying to define a service name in the Open WebUI deployment that didn't match the name the Ollama chart generates for its own service. Thanks again for reporting it, and sorry I didn't catch it before.
Action: Importing modelfiles
Error:
File "/usr/local/lib/python3.11/site-packages/requests/adapters.py", line 519, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='open-webui-ollama.xxxx.svc.cluster.local', port=11434): Max retries exceeded with url: /api/create (Caused by NameResolutionError("<urllib3.connection.HTTPConnection object at 0x7f5cef705e50>: Failed to resolve 'open-webui-ollama.xxxx.svc.cluster.local' ([Errno -2] Name or service not known)"))
INFO: 10.32.0.1:0 - "POST /ollama/api/create HTTP/1.1" 500 Internal Server Error
Kubernetes version: 1.27.2
It seems the charts assume 'open-webui-ollama' as the name of the Ollama service and do not recognize a different name.
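The failure in the traceback above is plain Kubernetes DNS composition: a ClusterIP Service resolves as `<service>.<namespace>.svc.<cluster-domain>`, so when the chart hardcodes the service half while the actual release produced a different name, resolution fails. A toy illustration (the "actual" release name below is hypothetical; only the assumed name comes from the log):

```python
def service_fqdn(service: str, namespace: str,
                 cluster_domain: str = "cluster.local") -> str:
    """Compose the in-cluster DNS name of a Kubernetes Service."""
    return f"{service}.{namespace}.svc.{cluster_domain}"

# The name the chart's helper assumed (taken from the traceback):
assumed = service_fqdn("open-webui-ollama", "xxxx")

# The name the Ollama chart might actually generate under a
# different release name (hypothetical example):
actual = service_fqdn("my-release-ollama", "xxxx")

print(assumed)            # open-webui-ollama.xxxx.svc.cluster.local
print(assumed == actual)  # False -> NameResolutionError at request time
```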