This issue was moved to a discussion.
Enhancement: Introduce the feature to invoke ollama through API on a remote machine #2581
Closed
Labels
enhancement
New feature or request
What features would you like to see added?
I would like to request the ability to run ollama on a remote server and reach it via its API. The motivation is to leverage the greater computational power of a dedicated machine for individual models. On the server, an nginx reverse proxy handles Bearer token authentication in front of ollama. For this to work, LibreChat needs to be adapted so it can send the required Authorization header to a remote ollama endpoint. Your assistance with this would be appreciated.
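For context, a minimal sketch of the reverse-proxy setup described above. The hostname, certificate details, and token value are placeholders, and the `if`-based check is only one simple way to gate requests; the point is that nginx rejects any request lacking the expected Bearer token before it reaches ollama on its default port:

```nginx
server {
    listen 443 ssl;
    server_name ollama.example.com;   # placeholder hostname

    location / {
        # Reject requests that do not carry the expected Bearer token.
        if ($http_authorization != "Bearer <your-secret-token>") {
            return 401;
        }
        proxy_pass http://127.0.0.1:11434;   # ollama's default port
        proxy_set_header Host $host;
    }
}
```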
More details
I think ModelService and the documentation would need to be updated.
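To illustrate what the client side would need to do (the helper name and URLs below are hypothetical, not part of LibreChat), the essential change is attaching an `Authorization: Bearer` header to requests sent to ollama's generate endpoint behind the proxy:

```python
import json
import urllib.request


def build_ollama_request(base_url: str, token: str, model: str, prompt: str):
    """Hypothetical helper: build a request to a proxied ollama instance
    with Bearer token authentication attached."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/api/generate",          # ollama's generate endpoint
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",  # what the nginx proxy checks
        },
        method="POST",
    )


# Build (but do not send) a request, then inspect the auth header.
req = build_ollama_request(
    "https://ollama.example.com", "secret-token", "llama2", "Hello"
)
print(req.get_header("Authorization"))
```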
Which components are impacted by your request?
No response
Pictures
No response
Code of Conduct