Streaming with "chat.completions.create" endpoint produces an httpx.RemoteProtocolError #1384
Closed
Labels
bug
Something isn't working
Confirm this is an issue with the Python library and not an underlying OpenAI API issue
Describe the bug
base_url: links FastChat with the OpenAI API
LLM for inference: "mistralai/Mixtral-8x7B-Instruct-v0.1"
When I try to stream chunks of generated text I get this error:
Note: when I don't stream (stream = False), I don't get any errors and the inference completes.

To Reproduce

Run the code below with the same LLM, Python version, and openai version listed here.
Code snippets
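The original snippet was not captured in this report; below is a minimal sketch of the kind of call that triggers the error, assuming a FastChat OpenAI-compatible server (the base_url and api_key values are placeholders you would replace with your own deployment's).

```python
def collect_deltas(deltas):
    """Join the incremental text pieces emitted by a streamed response,
    skipping None deltas (the first/last chunks often carry no content)."""
    return "".join(d for d in deltas if d)

if __name__ == "__main__":
    # Imported here so the helper above stays dependency-free.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:8000/v1",  # placeholder: FastChat endpoint
        api_key="EMPTY",                      # FastChat does not check the key
    )

    # stream=True raises httpx.RemoteProtocolError in the reported setup;
    # with stream=False the same call completes normally.
    stream = client.chat.completions.create(
        model="mistralai/Mixtral-8x7B-Instruct-v0.1",
        messages=[{"role": "user", "content": "Hello"}],
        stream=True,
    )
    text = collect_deltas(chunk.choices[0].delta.content for chunk in stream)
    print(text)
```

With stream=False the response arrives as a single object (`response.choices[0].message.content`) instead of an iterator of chunks, which is why the non-streaming path avoids the mid-stream connection error.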
OS
linux
Python version
Python 3.11.8
Library version
openai v1.24.0