Support logprobs for ChatCompletions #533

Open
l-eugine opened this issue Apr 15, 2024 · 0 comments
@l-eugine

Here is the description of the feature from OpenAI: https://cookbook.openai.com/examples/using_logprobs

Basically, it requires adding two additional parameters to ChatCompletionCreateRequest:

logprobs: 
    Whether to return log probabilities of the output tokens or not. 
    If true, returns the log probabilities of each output token returned in the content of message. 
    This option is currently not available on the gpt-4-vision-preview model.
top_logprobs: 
    An integer between 0 and 5 specifying the number of most likely tokens to return at each token position, 
    each with an associated log probability. logprobs must be set to true if this parameter is used.

This part can be done with the existing library by deriving from ChatCompletionCreateRequest and adding these parameters:

    public class ChatCompletionCreateRequest2 : ChatCompletionCreateRequest
    {
        /// <summary>
        /// Whether to return log probabilities of the output tokens or not.
        /// If true, returns the log probabilities of each output token returned in the content of message.
        /// This option is currently not available on the gpt-4-vision-preview model.
        /// </summary>
        [JsonPropertyName("logprobs")]
        public bool? LogProbs { get; set; }

        /// <summary>
        /// An integer between 0 and 5 specifying the number of most likely tokens to return at each token position, each with an associated log probability.
        /// logprobs must be set to true if this parameter is used.
        /// </summary>
        [JsonPropertyName("top_logprobs")]
        public int? TopLogprobs { get; set; }
    }
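
For reference, a minimal usage sketch with this derived class (assuming the library's IOpenAIService / ChatCompletion.CreateCompletion entry points; the model and message values here are illustrative):

    var request = new ChatCompletionCreateRequest2
    {
        Model = Models.Gpt_3_5_Turbo,
        Messages = new List<ChatMessage> { ChatMessage.FromUser("Hello") },
        LogProbs = true,
        TopLogprobs = 3
    };

    // The two extra properties serialize into the request body as expected;
    // the gap described below is on the response side.
    var response = await openAiService.ChatCompletion.CreateCompletion(request);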

But I could not find an easy way to extend the parsing of the response. It's hardcoded to parse the server response into ChatCompletionCreateResponse, which in turn decodes ChatChoiceResponse, which doesn't have a property to capture this part of the server response:

  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "NavigateParent()"
      },
      "logprobs": {
        "content": [
          {
            "token": "Navigate",
            "logprob": -6.704273e-7,
            "bytes": [
              78,
              97,
              118,
              105,
              103,
              97,
              116,
              101
            ],
            "top_logprobs": []
          },

So, the suggestion is to add the two optional parameters mentioned above to ChatCompletionCreateRequest, and to add a LogProbs property to ChatChoiceResponse (the same way it is declared in ChoiceResponse):

    [JsonPropertyName("logprobs")]
    public LogProbsResponse LogProbs { get; set; }
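
For completeness, here is a sketch of what the logprobs response models could look like, mirroring the JSON shape above. The class and property names are illustrative (TokenLogProbResult in particular is a hypothetical name), not the library's actual declarations:

    public class LogProbsResponse
    {
        /// <summary>The message content tokens, each with log probability information.</summary>
        [JsonPropertyName("content")]
        public List<TokenLogProbResult> Content { get; set; }
    }

    public class TokenLogProbResult
    {
        /// <summary>The token text.</summary>
        [JsonPropertyName("token")]
        public string Token { get; set; }

        /// <summary>The log probability of this token.</summary>
        [JsonPropertyName("logprob")]
        public double LogProb { get; set; }

        /// <summary>UTF-8 byte values of the token, or null if there is no byte representation.</summary>
        [JsonPropertyName("bytes")]
        public List<int> Bytes { get; set; }

        /// <summary>The top_logprobs most likely tokens at this position.</summary>
        [JsonPropertyName("top_logprobs")]
        public List<TokenLogProbResult> TopLogprobs { get; set; }
    }
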
@kayhantolga kayhantolga self-assigned this Apr 15, 2024
@kayhantolga kayhantolga added the enhancement New feature or request label Apr 15, 2024
@kayhantolga kayhantolga added this to the 8.0.3 milestone Apr 15, 2024