What is the input value for vector store search when running as json flow? #1870
Comments
Regarding the output loop:

```python
for result_data in result.outputs:
    if result_data is not None:
        print(result_data.results)
```

Ensure that the `run_flow_from_json` call matches your flow:

```python
result = run_flow_from_json(
    flow='my_flow.json',
    input_value='my_input',
    output_component='AstraDBSearch-avH6c',  # ensure this matches your component's ID
    tweaks=TWEAKS,
)
```

Make sure your component IDs and the way you access the output data align with your flow's configuration.
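The component IDs referenced above can be read straight out of the exported flow file. A minimal sketch, assuming the usual Langflow export layout (`data` → `nodes`, each node carrying an `id`); the IDs below are stand-ins, not values from this thread:

```python
import json

# Stand-in for an exported flow file; a real export would be read with
# json.load(open("my_flow.json")). The structure is an assumption based
# on Langflow's export format.
flow_json = """
{
  "data": {
    "nodes": [
      {"id": "OpenAIEmbeddings-k2P0a"},
      {"id": "AstraDBSearch-avH6c"}
    ]
  }
}
"""

flow = json.loads(flow_json)

# These IDs are what output_component and the tweaks keys must match.
component_ids = [node["id"] for node in flow["data"]["nodes"]]
print(component_ids)
```

Listing the IDs this way avoids guessing: whatever appears here is what `output_component='AstraDBSearch-...'` has to be.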
I see -- that's my mistake. However, I did have to add one more tweak; adding it correctly produced the results.
@dosu thanks, that issue is resolved. However, I am still curious about how I can run the flow by using the `input_value`.
To use the `input_value` with `run_flow_from_json`, pass it as the input for the component in your flow that accepts it. If you're encountering issues with the `input_value` not reaching the right component, look at the tweaks and adjustments you've attempted by directly modifying the flow. For your specific case, ensure that:

- the IDs used in `output_component` and in the `tweaks` keys match the component IDs in your exported JSON, and
- the parameters you tweak actually exist on those components.

This approach should allow you to run your flow as intended, with the `input_value` routed to the search component.
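The tweaks structure described above can be sketched as a plain dictionary keyed by component ID. Everything here is hypothetical (IDs, parameter names, and credentials are placeholders, not values confirmed by this thread):

```python
# Hedged sketch: tweaks map a component ID to a dict of parameter
# overrides. The IDs and parameter names below are illustrative only.
TWEAKS = {
    "OpenAIEmbeddings-k2P0a": {
        "openai_api_key": "sk-PLACEHOLDER",
    },
    "AstraDBSearch-avH6c": {
        "token": "AstraCS:PLACEHOLDER",
        "api_endpoint": "https://PLACEHOLDER.apps.astra.datastax.com",
        "input_value": "my search query",  # query override, per the discussion
    },
}

def tweak_for(component_id: str, tweaks: dict) -> dict:
    """Return the overrides registered for one component, or {} if none."""
    return tweaks.get(component_id, {})

print(tweak_for("AstraDBSearch-avH6c", TWEAKS))
```

This dictionary would then be passed as `tweaks=TWEAKS` to `run_flow_from_json`; keeping every override keyed by the exact component ID is what lets the runner route each value to the right node.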
Summary
Overall question: How do I know what is a valid input key and valid output component for my flows when exporting them to json to run?
I have a simple flow, with an OpenAIEmbeddings component hooked up to an AstraDBSearch. I have no other values set in my flow when I export it to json.
The given python code is:
Once I add in my API keys / endpoint, I still have an empty `input_value`. That makes sense to me, as I imagine I can set it myself. However, it fails to build, saying:
Now, just for the sake of the argument, I added it directly to the tweaks instead:
Now, I get the error:
So my questions for that are:

- **What purpose does `input_value` in `run_flow_from_json` serve in this context?**

**Version**
To Reproduce
Create a flow, export it to JSON, try running it as JSON.
Thanks!
Additional context