
[Bug] Local Ollama models cannot be called #2261

Closed

jerlinn opened this issue Apr 28, 2024 · 5 comments
Labels
🐛 Bug Something isn't working | 缺陷

Comments

@jerlinn

jerlinn commented Apr 28, 2024

💻 Operating System

macOS

📦 Environment

Docker

🌐 Browser

Chrome

🐛 Bug Description

After a local Docker deployment on a Mac, the local ollama models cannot be called.

Connectivity check passes:
(screenshot: CleanShot 2024-04-28 at 18 24 34@2x)

Local models:
(screenshot: WX20240428-182748@2x)

In actual use, every ollama model returns an error message:
(screenshot: WX20240428-182521@2x)

🚦 Expected Behavior

No response

📷 Recurrence Steps

Reproduces every time under normal use.

📝 Additional Information

  • ollama is running normally and its port is reachable
  • The problem persists whether or not the environment variable is set
  • Tried setting the environment variable both in the .zshrc file and directly in the terminal
@jerlinn jerlinn added the 🐛 Bug Something isn't working | 缺陷 label Apr 28, 2024
@lobehubbot
Member

👀 @jerlinn

Thank you for raising an issue. We will investigate into the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.

@jerlinn jerlinn changed the title from [Bug] to [Bug] Local Ollama models cannot be called Apr 28, 2024
@arvinxx
Contributor

arvinxx commented Apr 28, 2024

Could you take a look at the errors in the console?

@lobehubbot
Member

Bot detected the issue body's language is not English, translate it automatically. 👯👭🏻🧑‍🤝‍🧑👫🧑🏿‍🤝‍🧑🏻👩🏾‍🤝‍👨🏿👬🏿


Can you check the console for errors?

@rickdgit

Got the same error
Console output:

17930-c6532f96900818ef.js:1 Route: [ollama] InvalidOllamaArgs:
{error: undefined, errorType: 'InvalidOllamaArgs'}

@rickdgit

Okay, I figured it out.
Check the Interface proxy address and make sure it starts with 'http://'
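
As a rough illustration of why the missing scheme matters, here is a minimal TypeScript sketch. This is not LobeChat's actual code and the function name is invented for illustration; it only shows that the standard URL parser rejects a bare host:port string, so a proxy address without 'http://' never yields a usable endpoint, which can surface as an argument error like the InvalidOllamaArgs output above.

// Hypothetical sketch only — not the project's real implementation.
const resolveOllamaEndpoint = (proxyAddress: string): string | undefined => {
  try {
    // new URL() throws on a bare "host:port" string such as "127.0.0.1:11434"
    return new URL(proxyAddress).toString();
  } catch {
    return undefined; // the invalid address propagates as undefined, much like the error object above
  }
};

console.log(resolveOllamaEndpoint('127.0.0.1:11434'));        // undefined (missing scheme)
console.log(resolveOllamaEndpoint('http://127.0.0.1:11434')); // "http://127.0.0.1:11434/"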

@lobehub lobehub locked and limited conversation to collaborators Apr 29, 2024
@arvinxx arvinxx converted this issue into discussion #2284 Apr 29, 2024
