Confirm this is a feature request for the Python library and not the underlying OpenAI API.
This is a feature request for the Python library
Describe the feature or improvement you're requesting
I'm unsure whether this should be a feature request or a bug report, but the behavior of a stream's usage when the stream is cancelled is unclear. There are many scenarios in which the response is discarded before it completes, such as an interruption in a conversation. In these contexts, the stream can be cancelled (for example, by calling `stream.close()` or breaking out of the iteration).
However, in these cases I'm unable to read the usage of the stream. How should this be interpreted?
From a developer's standpoint, it would be more convenient if the usage were available on the response once the response completes (or is cancelled), rather than having to subscribe to the chunks after cancellation. That is, `stream.response.usage`.
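To illustrate the ambiguity without making a real API call, here is a minimal sketch that simulates the streaming behavior. It assumes (as with `stream_options={"include_usage": True}` in chat completions) that usage is only delivered in the final chunk, so cancelling early means the usage information is never observed. The `Chunk`, `Usage`, `fake_stream`, and `consume` names are illustrative, not part of the library:

```python
from dataclasses import dataclass
from typing import Iterator, Optional


@dataclass
class Usage:
    prompt_tokens: int
    completion_tokens: int


@dataclass
class Chunk:
    content: str
    usage: Optional[Usage]  # only populated on the final chunk


def fake_stream() -> Iterator[Chunk]:
    # Simulates a server stream where usage is attached only to the
    # last chunk, mirroring the API's streaming behavior.
    yield Chunk("Hello", None)
    yield Chunk(" world", None)
    yield Chunk("", Usage(prompt_tokens=12, completion_tokens=2))


def consume(stream: Iterator[Chunk], cancel_after: Optional[int] = None) -> Optional[Usage]:
    """Read chunks, optionally cancelling after `cancel_after` chunks."""
    usage = None
    for i, chunk in enumerate(stream):
        if chunk.usage is not None:
            usage = chunk.usage
        if cancel_after is not None and i + 1 >= cancel_after:
            break  # cancelled mid-stream: the usage chunk may never arrive
    return usage


# Fully consumed: usage is available.
print(consume(fake_stream()))
# Cancelled after the first chunk: usage is lost, which is the
# ambiguity this request is about.
print(consume(fake_stream(), cancel_after=1))
```

Exposing the accumulated usage directly on the stream (or on the final response object) would make the cancelled case unambiguous, instead of requiring the caller to watch every chunk for it.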
Additional context
No response