
PromptRunner does not support AzureChatOpenAI as an LLM input #1823

Open
tonydoesathing opened this issue May 2, 2024 · 1 comment

Comments

@tonydoesathing

Awesome project! Was playing around with the tool and hit a wall, since I only have access to Azure's OpenAI...

Describe the bug
PromptRunner does not support AzureChatOpenAI as an LLM input

Browser and Version

  • Chrome
  • Version 124.0.6367.78 (Official Build) (64-bit)

To Reproduce
Steps to reproduce the behavior:

  1. Go to 'New Project'
  2. Under LLMs, drag out a new AzureChatOpenAI
  3. Fill in the data required for the LLM
  4. Under Chains, drag out a new PromptRunner chain
  5. Try to connect the BaseLanguageModel output of the AzureChatOpenAI block to the LLM input on the PromptRunner block; the connection cannot be made

Expected Behavior
I should be able to connect AzureChatOpenAI's output to the LLM input on any block


dosubot bot commented May 2, 2024

To address the issue of connecting the BaseLanguageModel output of the AzureChatOpenAI block to the LLM input on the PromptRunner block, follow these steps:

  1. Check Configuration Parameters: Ensure the configuration for AzureChatOpenAIComponent is correct, including model, azure_endpoint, azure_deployment, api_version, api_key, temperature, and max_tokens. The api_version should match a supported version, such as "2023-12-01-preview" (see the sketch after this list).

  2. Error Handling: If a ValueError is raised, it might be due to incorrect parameters or network issues when connecting to the Azure OpenAI API; this indicates a problem instantiating AzureChatOpenAI.

  3. Compatibility Check: Make sure the PromptRunner block is compatible with the latest AzureChatOpenAI class. The AzureChatOpenAI class is designed to work with PromptRunner blocks that accept BaseLanguageModel objects, but compatibility issues can arise with updates.

  4. Update Dependencies: Ensure all dependencies, including langchain, langflow, and Azure SDKs, are current. Outdated dependencies can lead to compatibility problems.

  5. Browser Compatibility: Test the setup in another browser or incognito mode to rule out browser-specific issues.

  6. Review Documentation: The documentation for AzureChatOpenAIComponent might offer additional troubleshooting steps or insights.
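
For reference, the Langflow component ultimately wraps a LangChain AzureChatOpenAI object. A minimal standalone sketch, assuming the langchain_openai package is installed and using placeholder endpoint, deployment, and key values, can help confirm that the Azure parameters themselves are valid before debugging the block connection:

```python
# Hedged sketch: verifies Azure OpenAI credentials/parameters outside Langflow.
# The endpoint, deployment, and key values below are placeholders, not real values.
from langchain_openai import AzureChatOpenAI

llm = AzureChatOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",  # placeholder
    azure_deployment="<your-deployment-name>",                   # placeholder
    api_version="2023-12-01-preview",  # version mentioned in step 1
    api_key="<your-api-key>",          # placeholder
    temperature=0.7,
    max_tokens=256,
)

# If instantiation and a test call succeed here, the configuration is likely fine.
print(llm.invoke("Hello from Azure OpenAI").content)
```

If this runs outside Langflow but the connection in the UI still fails, the problem is more likely in which input types the PromptRunner block accepts than in the Azure configuration itself.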

If these steps don't resolve the issue, providing the exact error message or behavior observed when attempting the connection could help with further troubleshooting.


dosubot bot added the bug (Something isn't working) label on May 2, 2024