Support Ollama local api embeddings. #1369

Open
atljoseph opened this issue May 12, 2024 · 0 comments · May be fixed by #1384

@atljoseph

Is your feature request related to a problem? Please describe.
OpenAI and Azure embeddings are supported, along with local transformers. Ollama is already supported for inference, and Ollama has embedding endpoints, so the capability is literally right there. Please wire it up. I don't use Hugging Face, Azure, or OpenAI, and the local transformers option is slow.
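
For context, Ollama serves embeddings over its local HTTP API (`POST /api/embeddings`, which takes a model name and a prompt and returns an embedding vector). Below is a minimal sketch of the kind of call the integration would need to make, assuming a local Ollama server on the default port 11434 and an embedding-capable model such as `nomic-embed-text`; the helper name `get_ollama_embedding` is just illustrative and not part of this codebase:

```python
import requests

OLLAMA_BASE_URL = "http://localhost:11434"  # default Ollama port (assumed local setup)


def get_ollama_embedding(text: str, model: str = "nomic-embed-text") -> list[float]:
    """Request an embedding vector for `text` from a locally running Ollama server."""
    resp = requests.post(
        f"{OLLAMA_BASE_URL}/api/embeddings",
        json={"model": model, "prompt": text},
        timeout=60,
    )
    resp.raise_for_status()
    # Ollama responds with {"embedding": [<float>, ...]}
    return resp.json()["embedding"]


if __name__ == "__main__":
    vector = get_ollama_embedding("MemGPT stores memories as embeddings.")
    print(len(vector))  # embedding dimensionality, e.g. 768 for nomic-embed-text
```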

Describe the solution you'd like
Above.

Describe alternatives you've considered
Above.

Additional context
Above.

Make it easier to get started too.

And thanks for the dev portal. PLEASE keep improving it. Also, the button to open the chat is SO hard to see. It took me an hour to discover it, and I was going crazy saying to myself, “is that ALL that this thing does?!”.

A little bit of refinement goes a long way towards acceptance.

@sarahwooders sarahwooders self-assigned this May 12, 2024
@sarahwooders sarahwooders linked a pull request May 16, 2024 that will close this issue
Labels: None yet
Projects: Status: To triage

2 participants