Running without network error: ollama._types.ResponseError #85

Open
Gloridust opened this issue Mar 8, 2024 · 2 comments

Gloridust commented Mar 8, 2024

Really helpful project! However, I ran into a problem when I turn off the Wi-Fi connection.

  • OS: Windows 10 LTSC
  • CPU: R7-7840H
  • Language: Python

Traceback (most recent call last):
  File "c:\Users\gloridust\Documents\GitHub\LocalChatLLM\start.py", line 117, in <module>
    main_loop()
  File "c:\Users\gloridust\Documents\GitHub\LocalChatLLM\start.py", line 99, in main_loop
    output_text, received_message = get_response(message_history)
                                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "c:\Users\gloridust\Documents\GitHub\LocalChatLLM\start.py", line 63, in get_response
    response = ollama.chat(
               ^^^^^^^^^^^^
  File "C:\Users\gloridust\AppData\Local\Programs\Python\Python312\Lib\site-packages\ollama\_client.py", line 177, in chat        
    return self._request_stream(
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gloridust\AppData\Local\Programs\Python\Python312\Lib\site-packages\ollama\_client.py", line 97, in _request_stream
    return self._stream(*args, **kwargs) if stream else self._request(*args, **kwargs).json()
                                                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gloridust\AppData\Local\Programs\Python\Python312\Lib\site-packages\ollama\_client.py", line 73, in _request     
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError

My whole program works fine with an internet connection, but as soon as I turn off the Wi-Fi switch, it errors out completely.
You can see my project at: https://github.com/Gloridust/LocalChatLLM
I really need it to run completely offline. Any solutions?
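
One thing worth checking here (an assumption, not confirmed in this thread): the ollama Python client talks to the local server over HTTP, and its underlying HTTP stack can pick up system proxy settings, which break when Wi-Fi drops even though the server itself is local. A minimal sketch that targets 127.0.0.1 explicitly and surfaces the error details, assuming a default Ollama install and an already-pulled model such as qwen:4b:

import ollama

# Hedged sketch, not from the thread: point the client at 127.0.0.1
# explicitly (the default Ollama port) so no DNS lookup is needed.
# Assumes a local Ollama server and an already-pulled model, e.g. qwen:4b.
client = ollama.Client(host='http://127.0.0.1:11434')

try:
    response = client.chat(
        model='qwen:4b',
        messages=[{'role': 'user', 'content': 'Say this is a test'}],
    )
    print(response['message']['content'])
except ollama.ResponseError as e:
    # ResponseError carries the response body and HTTP status code,
    # which helps tell a proxy/connectivity failure from a model error.
    print('Error:', e.error, 'status code:', e.status_code)

Printing e.error and e.status_code should at least reveal whether the failure comes from the local server or from something in between.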

hj199717 commented Apr 1, 2024

I ran into the same problem: as soon as the network is down, this error is raised. Have you solved it on your end?
You could try the OpenAI API and see whether it works:

from openai import OpenAI

client = OpenAI(
    base_url='http://localhost:11434/v1/',
    api_key='ollama',  # required but ignored
)

chat_completion = client.chat.completions.create(
    messages=[
        {
            'role': 'user',
            'content': 'Say this is a test',
        }
    ],
    model='qwen:4b',
)

Gloridust (Author) commented

There are still some issues.
First of all, this OpenAI interface has a problem with the output format. If you directly print(chat_completion), you will find the output looks like this:

ChatCompletion(id='chatcmpl-969', choices=[Choice(finish_reason='stop', index=0, logprobs=None, message=ChatCompletionMessage(content='Sure, I can say "This is a test" as required. Is there anything else you would like me to do in this context?\n', role='assistant', function_call=N(completion_tokens=31, prompt_tokens=0, total_tokens=31))

Then I made a small change:

print(chat_completion.choices[0].message.content)

At this point, it runs fine while connected to the internet, but as soon as the network connection is cut, more problems appear:

Traceback (most recent call last):
  File "c:\Users\gloridust\Documents\GitHub\LocalChatLLM\test\test_openai-API.py", line 8, in <module>
    chat_completion = client.chat.completions.create(
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gloridust\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\_utils\_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gloridust\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\resources\chat\completions.py", line 667, in create
    return self._post(
           ^^^^^^^^^^^
  File "C:\Users\gloridust\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\_base_client.py", line 1213, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gloridust\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\_base_client.py", line 902, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "C:\Users\gloridust\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\_base_client.py", line 978, in _request
    return self._retry_request(
           ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gloridust\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\_base_client.py", line 1026, in _retry_request   
    return self._request(
           ^^^^^^^^^^^^^^
  File "C:\Users\gloridust\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\_base_client.py", line 978, in _request
    return self._retry_request(
           ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\gloridust\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\_base_client.py", line 1026, in _retry_request   
    return self._request(
           ^^^^^^^^^^^^^^
  File "C:\Users\gloridust\AppData\Local\Programs\Python\Python312\Lib\site-packages\openai\_base_client.py", line 993, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 502

It seems the OpenAI interface needs an internet connection. An internal server error (error code 502) occurred while creating the chat completion, which means something went wrong while trying to communicate with the OpenAI server.
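
A 502 coming back from a localhost base_url is often a sign that a system proxy is intercepting the request rather than Ollama itself failing. That is only a guess here, but it would explain why both clients break the moment the network goes down. A hedged sketch of one workaround, telling the OpenAI client's underlying httpx transport to ignore proxy environment variables:

import httpx
from openai import OpenAI

# Hedged sketch: if a system proxy is intercepting localhost traffic,
# trust_env=False stops httpx from reading proxy environment variables,
# so requests go straight to the local Ollama server.
client = OpenAI(
    base_url='http://127.0.0.1:11434/v1/',
    api_key='ollama',  # required by the client but ignored by Ollama
    http_client=httpx.Client(trust_env=False),
)

chat_completion = client.chat.completions.create(
    messages=[{'role': 'user', 'content': 'Say this is a test'}],
    model='qwen:4b',
)
print(chat_completion.choices[0].message.content)

Setting the NO_PROXY environment variable to localhost,127.0.0.1 before starting the script may achieve the same effect without any code changes.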
