
Ollama Error #1807

Open
VinojRaj opened this issue Apr 30, 2024 Discussed in #1804 · 8 comments
Comments

@VinojRaj

Discussed in #1804

Originally posted by VinojRaj April 30, 2024
I am new to Langflow and was trying to use Llama 2 through Ollama as the model, but I am getting the following error:
ValueError: Error building vertex Ollama: ChatOllamaComponent.build() missing 1 required positional argument: 'base_url'

The base URL is the default, http://localhost:11434/.
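For context, the error means the component was built without a value for its `base_url` argument. A small, hypothetical helper (this is illustrative, not Langflow code) shows the kind of normalized value that field expects — scheme, host, and port, with no trailing slash:

```python
from urllib.parse import urlparse

def normalize_base_url(url: str, default_port: int = 11434) -> str:
    """Normalize an Ollama base URL: add a scheme and port if missing,
    and drop any trailing slash. Port 11434 is Ollama's default."""
    if "://" not in url:
        url = "http://" + url
    parsed = urlparse(url)
    host = parsed.hostname or "localhost"
    port = parsed.port or default_port
    return f"{parsed.scheme}://{host}:{port}"

print(normalize_base_url("http://localhost:11434/"))  # http://localhost:11434
print(normalize_base_url("ollama"))                   # http://ollama:11434
```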

@qwaszaq

qwaszaq commented Apr 30, 2024

I have the same issue.

@qwaszaq

qwaszaq commented May 1, 2024

it seems to be working now :)

@AngelDPena

Hi, I'm no expert in Langflow, but I had a similar issue. Can you describe your Langflow deployment? Are you using Docker or running on your local machine? Have you tried making POST requests to Ollama using Postman?

In my case it was a network-related error: I was running Langflow and Ollama in separate Docker containers. Nothing that creating a network and attaching both containers to it couldn't fix. But I would need more details to tell what could be happening on your side (screenshots of the error would be much appreciated).
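If you don't have Postman handy, a quick reachability check from the machine running Langflow can be sketched in Python against Ollama's `/api/tags` endpoint (the helper name here is made up; adjust the URL to your setup):

```python
import urllib.request
import urllib.error

def ollama_reachable(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at base_url.
    /api/tags is a cheap GET endpoint that lists installed models."""
    url = base_url.rstrip("/") + "/api/tags"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, timeout, etc.
        return False

if __name__ == "__main__":
    print(ollama_reachable("http://localhost:11434"))
```

If this prints `False` from inside the Langflow container but `True` on the host, the problem is container networking, as described above.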

@qwaszaq

qwaszaq commented May 1, 2024

I suspect an update fixed a bug, or something else I am not familiar with changed. I have Ollama running on http://127.0.0.1:11434, which is typical; yesterday there was an error, and at midnight it went away ;). One remark: I am not running the app in a container...

@RaminParker

I am running Langflow locally on my computer and have the same problem.
What exactly is the base_url?

@JavierCCC

Same problem here, trying to run a RAG flow using Ollama.

@maga868

maga868 commented May 23, 2024

My friends,
here is what helped for me:
put both containers (langflow and ollama) on the same network.

  1. Create a new network "my-net":
     docker network create my-net
  2. Find the container names (last column; here ollama and docker_example-langflow-1):
     docker ps
  3. Connect the first container:
     docker network connect my-net ollama
  4. Connect the second container:
     docker network connect my-net docker_example-langflow-1
  5. Use the container name in the base_url, e.g. ollama:11434 for Ollama.

Hope that helps.
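For convenience, the steps above collected into one script (the container names `ollama` and `docker_example-langflow-1` are from this comment and will differ per setup; check `docker ps` for yours):

```shell
# Create a shared bridge network (the name is arbitrary).
docker network create my-net

# List running containers to find their names (last column).
docker ps

# Attach both containers to the shared network.
docker network connect my-net ollama
docker network connect my-net docker_example-langflow-1

# On this network, containers resolve each other by name,
# so Langflow can reach Ollama via ollama:11434 as the base_url.
```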

@adnanbwp

If you are running Ollama from the taskbar app, quit it, then start the Ollama server from a terminal (`ollama serve`). It will work.


7 participants