Can't make it work with LMStudio #1224
Comments
@tthierryEra can you share your config.json so I can better help debug?
Hello, thank you. I didn't change anything other than adding models.
@tthierryEra thanks. I think the best place to check would be the "Output" tab next to the VS Code terminal; in the dropdown on the right, select "Continue - LLM Prompts/Completions". This shows the exact prompt sent to the LLM. Together with the logs on the LM Studio side, this will likely show us what we need (it's almost certainly a prompt formatting mistake). The first possible solution I can think of is to double-check your prompt formatting settings on the LM Studio side. I believe you can edit these in the right side panel.
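For reference, a Continue model entry pointing at LM Studio typically looks something like the sketch below (the `title` and `model` values here are illustrative placeholders, not the reporter's actual settings):

```json
{
  "models": [
    {
      "title": "LM Studio",
      "provider": "lmstudio",
      "model": "local-model"
    }
  ]
}
```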
From my testing, whenever I add a system prompt, whether directly in LM Studio or in Continue's code (e.g. config.ts), that system prompt is printed back into the chat. If I leave everything empty it "works", but that isn't really useful since the LLM doesn't know how to react or interact. Example output from Continue:
Completion:
I tried leaving everything empty, or setting it as the prefix for the user message.
Maybe I'm doing something wrong?
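The symptom above (the system prompt echoed back verbatim) usually means the messages are not being wrapped in the chat delimiters the model was trained on. As a minimal sketch, a ChatML-style message formatter of the kind Continue lets you plug in from config.ts might look like this; the names `ChatMessage` and `templateChatml` are illustrative, not Continue's actual exports:

```typescript
// Illustrative ChatML-style prompt formatter. If messages reach the
// model without delimiters like these, a base model may treat the
// system prompt as text to continue (and echo it back) rather than
// as an instruction.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function templateChatml(msgs: ChatMessage[]): string {
  // Wrap each message in its role delimiters, then open an
  // assistant turn so the model replies instead of echoing.
  return (
    msgs
      .map((m) => `<|im_start|>${m.role}\n${m.content}<|im_end|>`)
      .join("\n") + "\n<|im_start|>assistant\n"
  );
}
```

Whichever template the model expects, the key point is that the formatting chosen in LM Studio's side panel and the prompt Continue sends have to agree; a mismatch on either side produces exactly this echo behavior.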
I just ran into this problem and fixed it on my end. The broken behavior happens when running LM Studio 0.2.22; reverting to version 0.2.21 fixed the prompt formatting.
You closed the issue because the solution is to downgrade LMStudio? |
@xinnod Where did you find the old version? Thanks
Before submitting your bug report
Relevant environment info
Description
Hello,
When using LMStudio, the result is the same in the chatbot and in files: I always get the system prompt echoed back, and nothing I try in the prompt formatting settings works.
I really need help here. It works with Ollama and external APIs.
Thanks
To reproduce
No response
Log output
No response