
[BUG] overrideConfig for parameter in 2 nodes not working as documented #2359

Closed
grimaldilionbrain opened this issue May 8, 2024 · 4 comments
Labels: bug (Something isn't working), question (Further information is requested)

Comments

@grimaldilionbrain

Describe the bug

When calling a prediction endpoint of a chatbot via the API, I tried using overrideConfig to pass overrides for two parameters with the same name in two different nodes (in my case openAIEmbeddings and chatOpenAI, both of which have a modelName parameter). However, the call seems to allow overriding all chatflow parameters except modelName for either of the two nodes: the value of modelName stays fixed at its default, regardless of the values given in overrideConfig. This happens even though I am specifying the two model names as described in the documentation (in the Embed window of the chatflow web interface):

body_data = {
    "question": <some_question>,
    "overrideConfig": {
        "modelName[openAIEmbeddings_0]": <embedding_model_name>,
        "modelName[chatOpenAI_0]": <chat_model_name>,
        ...
    }
}

(screenshot: Embed window snippet from the chatflow web interface)

(I know that the structure of the body_data argument is different in the screenshot, with no "overrideConfig" wrapper, but I followed the structure of the main documentation. Also, trying the format without "overrideConfig" results in none of the parameters being updated.)

(I double-checked: those seem to be the names of the two nodes, at least according to the exported chatflow schema.)

If I don't specify the node name (i.e. just using a single "modelName": ... entry), the value seems to be read correctly, but then of course one of the two nodes complains about a non-existent model.

The issue seems to be caused by the "<param_name>[<node_id>]" syntax itself, since using it for a uniquely named parameter in the chatflow also results in the override value being ignored.
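To summarize, these are the payload shapes I tried (model names are placeholders):

# 1) documented bracket syntax (Embed window / docs): modelName stays at its default
{"question": "...", "overrideConfig": {"modelName[openAIEmbeddings_0]": "<embedding_model>", "modelName[chatOpenAI_0]": "<chat_model>"}}

# 2) plain key: the value is read, but the same name goes to both nodes
{"question": "...", "overrideConfig": {"modelName": "<model>"}}

# 3) flat keys without "overrideConfig", as far as I can tell from the screenshot: nothing is updated
{"question": "...", "modelName[openAIEmbeddings_0]": "<embedding_model>"}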

To Reproduce
I am calling the API from a Python script, following the format suggested in the API documentation:

import requests
from requests import Response

# API url and key were read from .env, overrideConfigs stored in settings dictionary
headers = {"Authorization": f"Bearer {API_KEY}"}

def query(body_data) -> Response:
    # POST the JSON body to the chatflow prediction endpoint
    response = requests.post(API_URL, json=body_data, headers=headers)
    return response

def main() -> None:
    body_data = {
        "question": settings["question"],
        "overrideConfig": {
            "modelName[openAIEmbeddings_0]": settings["modelName_embeddings"],
            "modelName[chatOpenAI_0]": settings["modelName_chat"],
            # other params
        },
    }
    output = query(body_data)
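
For debugging I also print the response right after the call; a sketch, using .text rather than .json() since error responses may not be valid JSON:

    print(output.status_code, output.text)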

The chatflow is a simple Q&A over a text-file document with an in-memory store:

(screenshot: chatflow graph)

Expected behavior
When using an overrideConfig object containing fields of the form "modelName[<node_id>]", I expect the value of the modelName parameter in that node to be updated.

Flow

test_overrideConfig_issue Chatflow.json

Setup

  • Installed via docker
  • Flowise Version: 1.4.12
  • OS: Amazon Linux 2023
  • Browser: n.a.
@HenryHengZJ
Contributor

HenryHengZJ commented May 8, 2024

Can you try this instead?

import requests
from requests import Response

# ... API url and key were read from .env, overrideConfigs stored in settings dictionary
headers = {"Authorization": f"Bearer {API_KEY}"}

def query(body_data) -> Response:
    response = requests.post(API_URL, json=body_data, headers=headers)
    return response

def main() -> None:
    body_data = {
        "question": settings["question"],
        "overrideConfig": {
            # nest the node ids under the parameter name instead of
            # using the "modelName[<node_id>]" bracket syntax
            "modelName": {
                "openAIEmbeddings_0": settings["modelName_embeddings"],
                "chatOpenAI_0": settings["modelName_chat"]
            }
        },
    }
    output = query(body_data)

I think what's displayed on the UI might be a bit off, let me know if the above works
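
Presumably the same per-node nesting applies to any parameter that appears in more than one node, not just modelName; an untested sketch with a hypothetical temperature override:

"overrideConfig": {
    "modelName": {
        "openAIEmbeddings_0": "<embedding_model>",
        "chatOpenAI_0": "<chat_model>"
    },
    # hypothetical: the same nesting for another per-node override
    "temperature": {
        "chatOpenAI_0": 0.2
    }
}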

@HenryHengZJ added the bug and question labels on May 8, 2024
@grimaldilionbrain
Author

Hi Henry, thanks for the suggestion. It still seems like something is off, since I now get this error:

DEBUG:root:b"400 We could not parse the JSON body of your request. (HINT: This likely means you aren't using your HTTP library correctly. The OpenAI API expects a JSON payload, but what was sent was not valid JSON. If you have trouble figuring out how to fix this, please contact us through our help center at help.openai.com.)"

(To me it looks like the whole "modelName" object is still being sent to both of the nodes, so the OpenAI API complains that it is getting the wrong format for the model name in the call.)
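
If that guess is right, OpenAI would receive something like this where it expects a plain string (purely my speculation):

{"model": {"openAIEmbeddings_0": "<embedding_model>", "chatOpenAI_0": "<chat_model>"}}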

@HenryHengZJ
Contributor

I've tested it and it works:
(screenshot: working test)

Check that you are passing in the correct model name (a string).
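
E.g. a quick sanity check before sending; a sketch, assuming the settings keys from your snippet above:

for key in ("modelName_embeddings", "modelName_chat"):
    # each override value must be a plain string, not a dict or list
    assert isinstance(settings[key], str), f"{key} is {type(settings[key]).__name__}"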

@grimaldilionbrain
Author

You are right, somehow I must have made a mistake in specifying the model name. Sorry about that; trying again with your suggested format worked well. Thanks a lot!
