
FIX: fix the temperature value of ollama model #4027

Merged
merged 1 commit into langgenius:main on May 15, 2024

Conversation

Yash-1511
Copy link
Contributor

Description

Set the maximum temperature value to 1 for the Ollama models.

Fixes # (issue)

Type of Change

Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update, included: Dify Document
  • Improvement, including but not limited to code refactoring, performance optimization, and UI/UX improvement
  • Dependency upgrade

How Has This Been Tested?

Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration.

  • TODO

Suggested Checklist:

  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas
  • My changes generate no new warnings
  • I ran dev/reformat (backend) and cd web && npx lint-staged (frontend) to appease the lint gods
  • (optional) I have made corresponding changes to the documentation
  • (optional) I have added tests that prove my fix is effective or that my feature works
  • (optional) New and existing unit tests pass locally with my changes

@dosubot dosubot bot added the size:XS This PR changes 0-9 lines, ignoring generated files. label Apr 30, 2024
@takatost
Copy link
Collaborator

takatost commented May 1, 2024

Where did you get the range of values from? As far as I know, the temperature parameter for most models ranges from 0 to 2.

@Yash-1511
Copy link
Contributor Author

Yash-1511 commented May 1, 2024

I have seen that the temperature values of most large language models range between 0 and 1, but after your feedback I did some more research and found some useful discussions and links. There is still a lot of confusion, though.

openai community discussion
ollama issue
ollama issue
If we allow a temperature range of 0 to 2 for the Ollama models, then we also need to change the OpenAI models. Based on these Ollama issues and discussions, I found that Ollama uses the 0 to 2 range and should divide the temperature by 2 before processing the prompt. So this PR is not worth it on its own, but we do need to adjust the OpenAI model parameters and raise the maximum temperature to 2.
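
For illustration, here is a minimal sketch of the "divide by 2" behavior described in those Ollama issues, assuming a user-facing 0-2 temperature scale that gets halved before a request is sent. The function name and clamping are hypothetical, not Ollama's actual code:

def normalize_temperature(ui_value: float) -> float:
    """Map a user-facing temperature on a 0-2 scale down to 0-1.

    Hypothetical illustration of the halving discussed in the linked
    Ollama issues; not Ollama's real implementation.
    """
    clamped = max(0.0, min(2.0, ui_value))  # stay inside the advertised range
    return clamped / 2.0

print(normalize_temperature(1.6))  # -> 0.8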

@takatost
Copy link
Collaborator

takatost commented May 2, 2024

You're right; that's why we set the temperature range in our OpenAI-Compatible API provider to 0-2, while the OpenAI provider sets it to 0-1. We follow the official OpenAI provider's documented range, and since other providers compatible with the OpenAI API use different temperature ranges, we adjusted that provider to 0-2.

@Yash-1511
Copy link
Contributor Author

I think we need to set the temperature range to 0-2 in the official OpenAI provider as well.

docs of openai
This is from the OpenAI documentation: "What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic."
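
For reference, this is a minimal sketch of passing a temperature above 1 through the OpenAI Python SDK (v1-style client; the model name and prompt are just examples):

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The API accepts temperature values in [0, 2]; values above 1 tend to
# produce increasingly random output.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Say hello."}],
    temperature=1.8,
)
print(response.choices[0].message.content)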

@takatost
Copy link
Collaborator

takatost commented May 4, 2024

Oops, my bad. OpenAI updated the range of temperature values and I didn't notice.
Would you mind helping me adjust the range of temperature values for OpenAI?

@takatost takatost self-requested a review May 4, 2024 05:52
@Yash-1511
Copy link
Contributor Author

Sure

@Yash-1511
Copy link
Contributor Author

Hello @takatost,

Sorry for the late response. I have adjusted the OpenAI max temperature to 2, but when I tried asking some questions it generated some weird responses.
[screenshot]

When I decrease the temperature below 1, it answers what I want.
[screenshot]

To adjust the temperature, I followed these simple steps:

parameter_rules:
  - name: temperature
    use_template: temperature
    max: 2

Also, for testing I updated the version of the openai package, but it still does not work as expected. Do you have any idea?
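
As a sanity check on what that one-line rule change implies, here is a hypothetical sketch of how a min/max parameter rule could be enforced before a request goes out. The rule dict mirrors the parameter_rules YAML above; the min default and the helper function are assumptions, not Dify's actual validation code:

def apply_rule(value: float, rule: dict) -> float:
    """Clamp a model parameter to its rule's bounds.

    Hypothetical illustration mirroring the YAML snippet above;
    Dify's real validation logic may differ.
    """
    lo = rule.get("min", 0.0)  # 'min' is assumed; only 'max' appears in the YAML
    hi = rule.get("max", 1.0)
    return max(lo, min(hi, value))

temperature_rule = {"name": "temperature", "use_template": "temperature", "max": 2}
print(apply_rule(2.5, temperature_rule))  # -> 2, clamped to the new maximum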

@takatost
Copy link
Collaborator

Yeah, we tested it too. Once the temperature goes over 1, all sorts of garbled characters start showing up. Just as expected. 😅

And what issues might occur when updating to the latest version of the OpenAI SDK?

@dosubot dosubot bot added the lgtm This PR has been approved by a maintainer label May 15, 2024
@crazywoola crazywoola merged commit 332baca into langgenius:main May 15, 2024
7 checks passed
@Yash-1511
Copy link
Contributor Author

Same issue after updating the OpenAI SDK as well.
