Issues: promptengineers-ai/llm-server
#37 · Ollama RAG displays the underlying system message from the question rewrite · opened May 19, 2024 by ryaneggz
#25 · Allow files to be stored either as Base64 or in object storage · opened May 18, 2024 by ryaneggz
#24 · Not fully sold that the SYSTEM message should be passed to OpenAI during RAG · opened May 16, 2024 by ryaneggz
#20 · Embedding Factory default "text-embedding-3-small" needs to depend on index info · opened May 16, 2024 by ryaneggz