

Enhancement: Introduce the ability to invoke Ollama via API on a remote machine #2581

Closed
1 task done
derhelge opened this issue Apr 30, 2024 · 0 comments
Labels: enhancement (New feature or request)

Comments

@derhelge

What features would you like to see added?

I would like to request the ability to run Ollama on a remote server and access it via its API. This request comes from the need to leverage more computational power for individual models. On the server, an nginx reverse proxy handles Bearer Token authentication. For this to work effectively, LibreChat needs corresponding adaptations. Your assistance with this is appreciated.
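For context, here is a minimal sketch of the kind of nginx reverse-proxy setup I mean; the hostname and token value are placeholders, and Ollama is assumed to listen on its default port 11434:

```nginx
server {
    listen 443 ssl;
    server_name ollama.example.com;  # placeholder hostname

    location / {
        # Reject requests that do not carry the expected Bearer token.
        if ($http_authorization != "Bearer my-secret-token") {
            return 401;
        }
        # Forward authenticated requests to the local Ollama instance.
        proxy_pass http://127.0.0.1:11434;
        proxy_set_header Host $host;
    }
}
```

With this in place, any client (including LibreChat) must send an `Authorization: Bearer ...` header, which is exactly the piece LibreChat currently cannot configure for Ollama.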

More details

I think ModelService and the documentation would need to be updated.
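As a rough illustration of what such a change might look like, here is a TypeScript sketch of fetching the model list from Ollama's `/api/tags` endpoint with a Bearer token. The environment variable names `OLLAMA_BASE_URL` and `OLLAMA_API_KEY` are assumptions for the sake of the example, not existing LibreChat configuration:

```typescript
// Hypothetical sketch: query a remote Ollama instance behind
// Bearer-token auth. Env var names are placeholders, not real
// LibreChat settings.
const baseURL = process.env.OLLAMA_BASE_URL ?? 'http://localhost:11434';
const apiKey = process.env.OLLAMA_API_KEY;

async function fetchOllamaModels(): Promise<string[]> {
  // Ollama lists installed models at GET /api/tags.
  const res = await fetch(`${baseURL}/api/tags`, {
    headers: apiKey ? { Authorization: `Bearer ${apiKey}` } : {},
  });
  if (!res.ok) {
    throw new Error(`Ollama model fetch failed: ${res.status}`);
  }
  const data = (await res.json()) as { models: { name: string }[] };
  return data.models.map((m) => m.name);
}
```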

Which components are impacted by your request?

No response

Pictures

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct
@derhelge added the enhancement (New feature or request) label Apr 30, 2024
Repository owner locked and limited conversation to collaborators Apr 30, 2024
@danny-avila converted this issue into discussion #2583 Apr 30, 2024
