Chat format ignored #53
Comments
Switching on debug output, it seems to be applied correctly. Does the
Regarding order:
Right?
@woheller69 I currently use my own messages formatter system and the completion endpoints, not the chat completion endpoints. I think the "Using fallback chat format: None" message comes from llama-cpp-python. You can ignore it, since I don't use the chat completion endpoints.
OK, this means the format always has to be specified. It would be nice if, in the future, there were an option to apply the format from the GGUF, because it is often difficult to find out which template is required, and newer GGUFs usually contain that information...
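For context on the request above: newer GGUF files typically embed the model's chat template as Jinja2 text under the metadata key tokenizer.chat_template. A minimal sketch of the selection logic such a feature could use, assuming the metadata has already been read into a plain dict (the metadata values here are hypothetical stand-ins, not an actual GGUF read, and pick_chat_template is an illustrative name, not a real API):

```python
def pick_chat_template(metadata, fallback="chatml"):
    """Prefer the chat template embedded in the GGUF metadata, else fall back.

    `metadata` is assumed to be the GGUF key/value metadata as a dict,
    e.g. as exposed by a GGUF reader. Illustrative sketch only.
    """
    template = metadata.get("tokenizer.chat_template")
    if template:
        return ("embedded", template)
    return ("fallback", fallback)


# Hypothetical metadata as it might appear in a newer GGUF:
meta = {"tokenizer.chat_template": "{% for m in messages %}...{% endfor %}"}
print(pick_chat_template(meta)[0])  # embedded template found, use it
print(pick_chat_template({})[0])    # no template key, use the fallback
```

The point is only that the information needed to avoid guessing the format is often already inside the file; whether and how the library reads it is up to the maintainer.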
Trying the example chatbot_using_local_model.py with WizardLM2 (WizardLM-2-7B.Q8_0.gguf) gives:
but the example defines CHATML as the format:
predefined_messages_formatter_type=MessagesFormatterType.CHATML
Is the chat format ignored?
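For reference, the ChatML format that MessagesFormatterType.CHATML is meant to produce wraps each message in <|im_start|>/<|im_end|> tokens. A minimal stdlib-only sketch of that wire format (format_chatml is an illustrative function, not llama-cpp-agent's internal code):

```python
def format_chatml(messages):
    """Render a list of {"role", "content"} dicts in ChatML style.

    Illustrative sketch of the ChatML prompt layout only.
    """
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n")
    # Open an assistant turn so the model continues from here:
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)


prompt = format_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
print(prompt)
```

If the debug output shows prompts in this shape, the formatter is being applied; the "fallback chat format" line from llama-cpp-python is unrelated, since it concerns the chat completion endpoint rather than this formatter.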