
How do you set up a custom local LLM that is compatible with OpenAI? #11785

Closed · Answered by Y4hL
FrozzDay asked this question in Q&A

A patch was just merged that adds documentation for setting up OpenAI-API-compatible endpoints: Commit
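As a general illustration of what "OpenAI-compatible" means here (this is a generic sketch, not the configuration from the linked commit): a local server that implements the OpenAI API exposes a `/v1/chat/completions` route and accepts the same JSON request body as OpenAI's hosted API, so any OpenAI client can target it by overriding the base URL. The host, port, and model name below are assumptions.

```python
import json

# Assumed local endpoint (e.g. a llama.cpp/Ollama-style server); adjust to your setup.
BASE_URL = "http://localhost:8080/v1"

def chat_completion_body(model: str, prompt: str) -> dict:
    """Build the JSON body an OpenAI-compatible /chat/completions route expects."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

body = chat_completion_body("local-model", "Hello!")
# This is the payload you would POST to f"{BASE_URL}/chat/completions",
# with an Authorization header if your local server requires one.
print(json.dumps(body))
```

Because the request shape matches OpenAI's, official SDKs that accept a custom base URL can be pointed at the local server without other code changes.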

Answer selected by mrnugget