Any guidance for Ollama API support? #539

Answered by MaxLeiter
iam4x asked this question in Help

A PR would be accepted.

You'll likely need to create a wrapper for the Ollama REST API, similar to the Anthropic stream here: https://github.com/vercel/ai/blob/main/packages/core/streams/anthropic-stream.ts
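A minimal sketch of what such a wrapper could look like, assuming Ollama's `/api/generate` endpoint streams newline-delimited JSON chunks with a `response` field (the names `parseOllamaChunk` and `OllamaStream` are hypothetical, not part of the AI SDK):

```typescript
// Hypothetical sketch of an Ollama stream wrapper, following the same
// pattern as AnthropicStream: parse the provider's streaming format
// into plain text tokens and expose them as a ReadableStream.

type OllamaChunk = { response?: string; done?: boolean };

// Parse one raw NDJSON payload (possibly several lines) into text tokens.
// A production version would also buffer partial lines split across chunks.
function parseOllamaChunk(raw: string): string[] {
  return raw
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as OllamaChunk)
    .filter((chunk) => !chunk.done && typeof chunk.response === "string")
    .map((chunk) => chunk.response as string);
}

// Wrap a fetch Response body into a text stream.
function OllamaStream(res: Response): ReadableStream<string> {
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  return new ReadableStream<string>({
    async pull(controller) {
      const { done, value } = await reader.read();
      if (done) return controller.close();
      for (const token of parseOllamaChunk(decoder.decode(value))) {
        controller.enqueue(token);
      }
    },
  });
}
```

A PR would also need to adapt this to the SDK's `AIStream` callbacks and SSE/response helpers, which this sketch omits.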

Replies: 7 comments

Answer selected by iam4x
Category: Help
Labels: none yet
7 participants