No response from ollama (Openshift / Kubernetes) #1039
acocalypso started this conversation in General
Replies: 0 comments
I set it up on an OpenShift cluster. Ollama and the WebUI are running in CPU-only mode, and I can pull models, add prompts, etc.
When I open a chat, select a model, and ask a question, it runs forever and I never get a response.
On the Ollama server I see:
WebUI:
My Ollama container isn't using any CPU, so I assume it's not doing anything at all.
Is there any way to troubleshoot this?
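One way to narrow this down is to bypass the WebUI and query the Ollama API directly from inside the cluster (e.g. via `oc exec` into a pod that can reach the Ollama service). This is a sketch, not from the original post: the service URL and the model name `llama2` are assumptions, so substitute your own service name/route and a model you have actually pulled.

```shell
# Assumed service URL; replace with your Ollama Service/Route inside OpenShift.
OLLAMA_URL="${OLLAMA_URL:-http://ollama:11434}"

# 1. Is the server reachable at all? A healthy server replies "Ollama is running".
curl -s "$OLLAMA_URL" || echo "server unreachable"

# 2. Which models does the server actually have pulled?
curl -s "$OLLAMA_URL/api/tags"

# 3. Try a generation directly. If this also hangs, the problem is Ollama
#    itself (CPU/memory limits, model loading), not the WebUI.
#    "llama2" is a placeholder model name.
curl -s "$OLLAMA_URL/api/generate" \
  -d '{"model": "llama2", "prompt": "Hi", "stream": false}' || true
```

If step 3 responds but the WebUI still hangs, check the WebUI's `OLLAMA_BASE_URL` (or equivalent) configuration and the pod logs with `oc logs <pod>`; if step 3 hangs too, check the Ollama pod's resource limits, since a CPU-only model that never gets scheduled CPU time will sit idle exactly as described.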