
Ollama Local model issue after update #494

Closed
HashemHamdy opened this issue Apr 26, 2024 · 5 comments

Comments
@HashemHamdy

After cloning yesterday's version, the local model is no longer detected and won't reply even one or two steps, unlike the previous version.
24.04.26 21:38:46: root: ERROR : Inference took too long. Model: OLLAMA, Model ID: llama3
24.04.26 21:38:46: root: INFO : SOCKET inference MESSAGE: {'type': 'error', 'message': 'Inference took too long. Please try again.'}
24.04.26 21:38:46: root: WARNING: Inference failed

@nemsip

nemsip commented Apr 27, 2024

Same problem for me! Hopefully this gets fixed soon; I was really excited to try this out.

@maysaraanalyst

I am also facing the same problem.
24.04.27 11:48:22: root: INFO : SOCKET tokens MESSAGE: {'token_usage': 730}
Model: mistral, Enum: OLLAMA
24.04.27 11:48:23: root: INFO : SOCKET inference MESSAGE: {'type': 'time', 'elapsed_time': '0.00'}
24.04.27 11:49:24: root: ERROR : Inference took too long. Model: OLLAMA, Model ID: mistral
24.04.27 11:49:24: root: INFO : SOCKET inference MESSAGE: {'type': 'error', 'message': 'Inference took too long. Please try again.'}
24.04.27 11:49:24: root: WARNING: Inference failed

@iChristGit

Also happens for me with a 3090:
Inference took too long
I'm just using a 20 GB command-r version, and normal Ollama chat works super fast.
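To rule out the model itself being slow, it helps to time a direct call to the local Ollama server and compare that with the tool's timeout. The sketch below uses Ollama's documented REST endpoint on the default port; the model name and prompt are placeholders, so adjust them to your setup.

```python
import time

import requests  # third-party; assumes `pip install requests`

# Time a direct call to the local Ollama server to rule out the model itself
# being slow. The /api/generate endpoint and port 11434 are Ollama's documented
# defaults; the model name and prompt here are placeholders.
start = time.time()
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3", "prompt": "Reply with one word.", "stream": False},
    timeout=300,
)
resp.raise_for_status()
print(f"Ollama answered in {time.time() - start:.1f}s: {resp.json()['response'][:80]!r}")
```

If this direct call comes back in a few seconds while the tool still reports "Inference took too long", the problem is the tool's timeout rather than the model.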

@shahab00x

Same here.

@ARajgor
Collaborator

ARajgor commented May 2, 2024

You can now update the inference timeout via the settings page. Fetch the latest changes.
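For context, the "Inference took too long" error in the logs comes from a client-side timeout wrapped around the model call. The sketch below is only a generic illustration of that pattern under assumed names (`run_inference`, the 60-second default); it is not the project's actual code, but it shows why raising the configured timeout lets slower local models finish.

```python
import concurrent.futures

# Generic illustration only: run_inference and the 60-second default are
# placeholders for this sketch, not the project's actual code. The point is
# that the error is a client-side timeout, so raising the configured value
# gives slow local models time to finish.
INFERENCE_TIMEOUT_SECONDS = 60  # now adjustable from the settings page

def run_inference(prompt: str) -> str:
    """Placeholder for the actual call to the local Ollama model."""
    raise NotImplementedError

def guarded_inference(prompt: str) -> str:
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
    future = pool.submit(run_inference, prompt)
    try:
        return future.result(timeout=INFERENCE_TIMEOUT_SECONDS)
    except concurrent.futures.TimeoutError:
        # Matches the "Inference took too long. Please try again." message in the logs.
        raise RuntimeError("Inference took too long. Please try again.")
    finally:
        pool.shutdown(wait=False)
```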

ARajgor closed this as completed May 2, 2024