Ollama Error #1807
I have the same ...
it seems to be working now :)
Hi, I'm no expert in Langflow, but I had a similar issue. Can you describe your Langflow deployment? Are you using Docker or running on your local machine? Have you tried making POST requests to Ollama with Postman? In my case it was a network-related error: I was running Langflow and Ollama in separate Docker containers, which was nothing that creating a network and running both containers on it couldn't fix. But I would need more details to tell what could be happening on your side (images of the error would be much appreciated).
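To rule out a network problem without Postman, you can hit Ollama's generate endpoint directly. A minimal sketch using only the standard library; the base URL and model name here are assumptions, so adjust them to your deployment (e.g. the Docker service name instead of localhost when the containers share a network):

```python
import json
import urllib.request

def build_generate_request(base_url="http://127.0.0.1:11434",
                           model="llama2", prompt="Hello"):
    # Builds (but does not send) a POST request for Ollama's
    # /api/generate endpoint; stream=False asks for a single JSON reply.
    payload = json.dumps({"model": model, "prompt": prompt,
                          "stream": False}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

# To actually send it (requires a running Ollama server):
#   with urllib.request.urlopen(build_generate_request(), timeout=30) as r:
#       print(json.loads(r.read())["response"])
```

If that request fails with a connection error, the problem is between the two containers (or hosts), not inside Langflow.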
My bet is that an update changed something, or it's some other issue I'm not familiar with. I have Ollama running on http://127.0.0.1:11434, which is the typical address, but yesterday there was an error; at midnight it changed ;). One remark: I am not running the app in a container...
I am running Langflow locally on my computer. I have the same problem.
Same problem here. Trying to run a RAG flow using Ollama.
My friends,

If you are running Ollama from the taskbar, quit it. Then go to a terminal and start the Ollama server with `ollama serve`. It will work.
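Before and after restarting the server you can check whether anything is listening at the base URL. A small reachability sketch, assuming the default Ollama address (a running server answers a plain GET at the root, so any 200 response means it is up):

```python
import urllib.error
import urllib.request

def is_ollama_up(base_url="http://127.0.0.1:11434", timeout=3):
    # Returns True if something answers HTTP 200 at base_url,
    # False on connection refused, timeout, or any other socket error.
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

If this returns False after you started `ollama serve`, double-check the port and that nothing else (like the taskbar instance) is still holding it.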
Discussed in #1804
Originally posted by VinojRaj April 30, 2024
I am new to Langflow and I was trying to use Llama2 through Ollama as the model but I am getting the following error:
ValueError: Error building vertex Ollama: ChatOllamaComponent.build() missing 1 required positional argument: 'base_url'
The base URL is the default, http://localhost:11434/.
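For reference, the traceback is Python complaining that a required positional argument was never supplied when the vertex was built. A minimal stand-in (not Langflow's actual component, just a hypothetical illustration) shows why filling in the Base URL field explicitly resolves it, using the default address from the report above:

```python
# Hypothetical stand-in for ChatOllamaComponent.build(); the real Langflow
# component takes more parameters, but the failure mode is the same:
# Python raises a TypeError when a required positional argument is
# missing, which Langflow surfaces as the ValueError quoted above.
def build(base_url, model="llama2"):
    return {"base_url": base_url, "model": model}

# Passing the default Ollama address explicitly avoids the error:
config = build(base_url="http://localhost:11434")
```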