
response error when calling any ollama functions #88

Open
Hansyvea opened this issue Mar 11, 2024 · 3 comments

@Hansyvea

I have the ollama service running in the background, and it works well when running any model in the terminal.
However, when calling it from Python, this happens:

import ollama
ollama.list()

site-packages/ollama/_client.py:71  response.raise_for_status()
site-packages/ollama/_client.py:72  except httpx.HTTPStatusError as e:
---> site-packages/ollama/_client.py:73  raise ResponseError(e.response.text, e.response.status_code) from None
site-packages/ollama/_client.py:75  return response

@mxyng
Collaborator

mxyng commented Mar 19, 2024

Are you using the standard ollama host/port, e.g. 127.0.0.1:11434? If not, you will need to set OLLAMA_HOST or use ollama.Client(host='...').
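A quick way to test this is to probe the server directly before involving the library. A minimal stdlib-only sketch; the default host/port and the `OLLAMA_HOST` variable come from the comment above, the helper function itself is illustrative:

```python
import os
import urllib.request
import urllib.error

def ollama_reachable(host: str = "http://127.0.0.1:11434", timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at the given address."""
    try:
        with urllib.request.urlopen(host, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

# If the server listens on a non-default address, point the client there,
# either via the environment variable (read by the library)...
os.environ.setdefault("OLLAMA_HOST", "http://127.0.0.1:11434")
# ...or explicitly: client = ollama.Client(host="http://127.0.0.1:11434")
print(ollama_reachable())
```

If the probe returns False while `ollama run` works in the terminal, the server is likely bound to a different address or port than the Python client is using.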

@CufeDigitalEcon

Try turning off your VPN app.

@edgemoorlf

> Try turning off your VPN app.

Thanks, buddy. You made my day!

Wondering why? A REST call to http://127.0.0.1:11434/api/embeddings worked fine on the same machine.
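A plausible explanation for the difference (an assumption, not confirmed in this thread): the Python client is built on httpx (visible in the traceback above), which honors the HTTP_PROXY/HTTPS_PROXY environment variables that VPN or proxy software often sets, so its requests to 127.0.0.1 can get routed into the tunnel, while the tool used for the direct REST call may have bypassed the proxy for localhost. Listing loopback addresses in NO_PROXY is the usual fix; the proxy address below is hypothetical:

```python
import os
import urllib.request

# Hypothetical proxy settings that VPN/proxy software might export:
os.environ["HTTPS_PROXY"] = "http://proxy.example:8080"
os.environ["HTTP_PROXY"] = "http://proxy.example:8080"

# Tell proxy-aware clients (httpx, requests, curl) to bypass the proxy
# for loopback addresses, so the local Ollama server is reached directly:
os.environ["NO_PROXY"] = "localhost,127.0.0.1"

# Inspect what proxy-aware libraries will now see:
print(urllib.request.getproxies())
```

Setting these before the client is created (or before `import ollama` in a fresh process) matters, since proxy settings are typically read when the underlying HTTP client is constructed.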
