I am running Open WebUI and Ollama on Windows 11. Ollama runs inside a Conda environment and works correctly with my Python code (for example, via ollama.chat) and from the CLI.
I then start Open WebUI with Docker using the following command, and it comes up normally:
docker run -d -p 3000:8080 --gpus all --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:cuda
However, when I try to ask a question, I get a message that no model is selected. In the settings screen, the "Connections" tab appears to be working: Ollama responds, and its log shows a 127.0.0.1 GET request each time I hit the refresh icon. Nonetheless, when I open the "Models" tab, I get a "Server connection failed" error. Here is a screenshot for reference:
I checked the error log in Docker and found the traceback below, which ends in a KeyError on a missing "model" field. I'm wondering whether this could be related to the Conda environment, or to the OLLAMA_MODELS environment variable I set in Windows to relocate the models to another drive.
2024-05-04 21:55:31 Traceback (most recent call last):
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 435, in run_asgi
2024-05-04 21:55:31 result = await app( # type: ignore[func-returns-value]
2024-05-04 21:55:31 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
2024-05-04 21:55:31 return await self.app(scope, receive, send)
2024-05-04 21:55:31 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
2024-05-04 21:55:31 await super().__call__(scope, receive, send)
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
2024-05-04 21:55:31 await self.middleware_stack(scope, receive, send)
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
2024-05-04 21:55:31 raise exc
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
2024-05-04 21:55:31 await self.app(scope, receive, _send)
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
2024-05-04 21:55:31 with collapse_excgroups():
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
2024-05-04 21:55:31 self.gen.throw(typ, value, traceback)
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
2024-05-04 21:55:31 raise exc
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 260, in wrap
2024-05-04 21:55:31 await func()
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 217, in stream_response
2024-05-04 21:55:31 return await super().stream_response(send)
2024-05-04 21:55:31 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 249, in stream_response
2024-05-04 21:55:31 async for chunk in self.body_iterator:
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 181, in body_stream
2024-05-04 21:55:31 raise app_exc
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 151, in coro
2024-05-04 21:55:31 await self.app(scope, receive_or_disconnect, send_no_error)
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 83, in __call__
2024-05-04 21:55:31 await self.app(scope, receive, send)
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
2024-05-04 21:55:31 with collapse_excgroups():
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
2024-05-04 21:55:31 self.gen.throw(typ, value, traceback)
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
2024-05-04 21:55:31 raise exc
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 260, in wrap
2024-05-04 21:55:31 await func()
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 217, in stream_response
2024-05-04 21:55:31 return await super().stream_response(send)
2024-05-04 21:55:31 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 249, in stream_response
2024-05-04 21:55:31 async for chunk in self.body_iterator:
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 181, in body_stream
2024-05-04 21:55:31 raise app_exc
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 151, in coro
2024-05-04 21:55:31 await self.app(scope, receive_or_disconnect, send_no_error)
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
2024-05-04 21:55:31 await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
2024-05-04 21:55:31 raise exc
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
2024-05-04 21:55:31 await app(scope, receive, sender)
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 758, in __call__
2024-05-04 21:55:31 await self.middleware_stack(scope, receive, send)
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 778, in app
2024-05-04 21:55:31 await route.handle(scope, receive, send)
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 487, in handle
2024-05-04 21:55:31 await self.app(scope, receive, send)
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
2024-05-04 21:55:31 await super().__call__(scope, receive, send)
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
2024-05-04 21:55:31 await self.middleware_stack(scope, receive, send)
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
2024-05-04 21:55:31 raise exc
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
2024-05-04 21:55:31 await self.app(scope, receive, _send)
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
2024-05-04 21:55:31 with collapse_excgroups():
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
2024-05-04 21:55:31 self.gen.throw(typ, value, traceback)
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
2024-05-04 21:55:31 raise exc
2024-05-04 21:55:31 File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__
2024-05-04 21:55:31 response = await self.dispatch_func(request, call_next)
2024-05-04 21:55:31 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-04 21:55:31 File "/app/backend/apps/ollama/main.py", line 77, in check_url
2024-05-04 21:55:31 await get_all_models()
2024-05-04 21:55:31 File "/app/backend/apps/ollama/main.py", line 159, in get_all_models
2024-05-04 21:55:31 app.state.MODELS = {model["model"]: model for model in models["models"]}
2024-05-04 21:55:31 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-05-04 21:55:31 File "/app/backend/apps/ollama/main.py", line 159, in <dictcomp>
2024-05-04 21:55:31 app.state.MODELS = {model["model"]: model for model in models["models"]}
2024-05-04 21:55:31 ~~~~~^^^^^^^^^
2024-05-04 21:55:31 KeyError: 'model'
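To understand the failing line, I reproduced it offline. The traceback shows Open WebUI indexing each entry of Ollama's /api/tags response by model["model"]. As far as I can tell, older Ollama builds return only a "name" field per model, which would raise exactly this KeyError. The sketch below is my own guess at what is happening, not Open WebUI's actual code; the payloads are made up to illustrate the two response shapes, and index_models is a hypothetical helper:

```python
# Simulated /api/tags payloads (illustrative, not captured from a real server):
# newer Ollama responses include both "name" and "model" per entry;
# older builds return only "name".
new_style = {"models": [{"name": "llama3:latest", "model": "llama3:latest"}]}
old_style = {"models": [{"name": "llama3:latest"}]}

def index_models(payload):
    # Open WebUI's line from the traceback keys strictly on "model":
    #   {model["model"]: model for model in models["models"]}
    # which raises KeyError: 'model' on the old-style payload.
    # Falling back to "name" tolerates both shapes.
    return {m.get("model", m.get("name")): m for m in payload["models"]}

print(index_models(new_style))
print(index_models(old_style))  # the strict version would raise KeyError here
```

If this guess is right, updating Ollama on the Windows host (so /api/tags includes the "model" field) might resolve the error without touching the Conda environment or OLLAMA_MODELS.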