[Bug]: LLM Provider NOT provided #1382
Comments
Can you please check this value, "GPT4-1106"? You could quickly try 'GPT4-1106-preview', or check the deployments page in your Azure account.
In your Azure account there's a "Deployments" page/tab, I think, where you can see the names of your deployments. It's that name you need for the chat model. However, I need to add a detail: if it's different from the default model name, which it might be, then:
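In case it helps, the deployment name usually goes into the model setting with a provider prefix. A hedged sketch only; the exact key name is an assumption and may differ between versions:

```toml
# Hypothetical: use the Azure *deployment* name, not the underlying model name.
# E.g. if your Azure deployment is named "GPT4-1106-preview":
LLM_MODEL = "azure/GPT4-1106-preview"
```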
Please make sure to open the web UI, and in Settings, enter the model and save. Even if you passed it as a parameter, save it in the UI. Does it work?
Now it works!
Is there an existing issue for the same bug?
Describe the bug
I'm using version 0.4.0 and following the guidance in AzureLLMs.md.
I configured config.toml, but when I executed `make run`, I encountered the following problem. Is this a bug, or how should I modify the configuration?
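For reference, here is a minimal config.toml sketch along the lines of AzureLLMs.md. The key names, placeholders, and API version are assumptions for illustration and may not match your release, so compare them against the doc before using:

```toml
# Hypothetical sketch — verify key names against AzureLLMs.md for 0.4.0.
LLM_MODEL = "azure/<your-deployment-name>"    # Azure deployment name with "azure/" prefix
LLM_API_KEY = "<your-azure-api-key>"
LLM_BASE_URL = "https://<your-resource>.openai.azure.com"
LLM_API_VERSION = "2023-12-01-preview"        # example api-version; check your endpoint
```

After saving the file, also enter the same model in the web UI's Settings and save, as suggested in the comments above.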
Current Version
Installation and Configuration
Model and Agent
No response
Reproduction Steps
No response
Logs, Errors, Screenshots, and Additional Context
No response