Hmm, OK, I see some inflexibility in applying this method directly: it cannot be applied to a single message, only to a whole chat history. During inference that is fine, since we always feed the full history to the agents, but in other places, such as where we need it in score_based.py, it just doesn't work. Maybe we should come up with a better solution that combines the tokenizer's shipped chat_template with some customization of our own.
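One possible direction (a sketch only; the helper names and the ChatML-style wrapper below are assumptions for illustration, not existing CAMEL code): render a single message with the same per-turn wrapper a chat template would emit, and build the full-history prompt by concatenating those renderings, so both use cases share one code path.

```python
# Hypothetical helpers sketching how a per-message formatter could coexist
# with full-history formatting. The ChatML-style tokens (<|im_start|>,
# <|im_end|>) are just one example template format.

def format_single_message(role: str, content: str) -> str:
    """Wrap one message the way a ChatML-style chat template would,
    without requiring the whole conversation history."""
    return f"<|im_start|>{role}\n{content}<|im_end|>\n"


def format_history(messages: list) -> str:
    """Format a full OpenAI-style history (list of {'role', 'content'}
    dicts) by concatenating the single-message renderings."""
    return "".join(
        format_single_message(m["role"], m["content"]) for m in messages
    )
```

With this split, score_based.py could call format_single_message directly, while inference keeps using the full-history path.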
Required prerequisites
Motivation
In the current support for open-source models we have a function called
message_to_prompt
to convert the OpenAI user-assistant alternating format into the corresponding model template. I believe this is exactly what the .apply_chat_template
method does. For models that don't have a built-in template or need special treatment, we could consider writing a function for those use cases. ref:
https://huggingface.co/docs/transformers/main/en/chat_templating
camel/camel/utils/token_counting.py
Line 21 in d7e4924
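The idea above could be sketched roughly as follows (the function name messages_to_prompt, the CUSTOM_TEMPLATES registry, and the simplified Llama-2-style formatter are hypothetical, not part of the current codebase): prefer the tokenizer's built-in chat template when one exists, and fall back to a hand-written per-model formatter otherwise.

```python
# Hedged sketch of a dispatch between the HF-shipped chat template and a
# custom per-model formatter. Model names and templates are examples only.

def llama2_prompt(messages):
    """Very simplified Llama-2-style rendering of user/assistant turns."""
    parts = []
    for m in messages:
        if m["role"] == "user":
            parts.append(f"[INST] {m['content']} [/INST]")
        elif m["role"] == "assistant":
            parts.append(f" {m['content']} ")
    return "".join(parts)


# Registry of fallback formatters for models without a built-in template.
CUSTOM_TEMPLATES = {"llama-2": llama2_prompt}


def messages_to_prompt(model_name, messages, tokenizer=None):
    """Convert OpenAI-style messages into a model-specific prompt string."""
    if tokenizer is not None and getattr(tokenizer, "chat_template", None):
        # Real HF API: with tokenize=False this returns the prompt string.
        return tokenizer.apply_chat_template(
            messages, tokenize=False, add_generation_prompt=True
        )
    # No built-in template: use our own formatter for this model family.
    return CUSTOM_TEMPLATES[model_name](messages)
```

This keeps apply_chat_template as the default path while leaving room for the special-treatment functions the issue proposes.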
Solution
No response
Alternatives
No response
Additional context
No response