input_ids = [tokenizer.get_command("[gMASK]"), tokenizer.get_command("sop")] + tokenizer.convert_tokens_to_ids(tokens)

What does this line mean? Why does it differ so much from the ChatGLM version, and why can the input be written in this format?
I have the same question. When we tried this format, the results were very poor.
Because ChatGLM2 and ChatGLM were officially trained on different data formats. PS: the two models' architectures also differ substantially: one is a prefix-LM and the other a causal-LM.
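A minimal sketch of what that line does, using a mock tokenizer (the class, token ids, and vocabulary below are illustrative assumptions, not the real ChatGLM2 tokenizer): `get_command()` looks up the id of a special control token, and the `[gMASK]` + `sop` ids are simply prepended to the ids of the ordinary prompt tokens.

```python
# Sketch only, NOT the official ChatGLM2 code. MockTokenizer stands in
# for the real tokenizer, which exposes get_command() for special tokens
# and convert_tokens_to_ids() for ordinary tokens.
class MockTokenizer:
    # Hypothetical id tables; the real ids come from the model's vocab.
    special = {"[gMASK]": 64790, "sop": 64792}
    vocab = {"你": 30910, "好": 30939}

    def get_command(self, token):
        return self.special[token]

    def convert_tokens_to_ids(self, tokens):
        return [self.vocab[t] for t in tokens]

tokenizer = MockTokenizer()
tokens = ["你", "好"]

# The [gMASK] and sop ids mark the start of the sequence, matching the
# format ChatGLM2 saw during training; the prompt ids follow directly.
input_ids = [tokenizer.get_command("[gMASK]"),
             tokenizer.get_command("sop")] + tokenizer.convert_tokens_to_ids(tokens)
print(input_ids)  # two special-token ids first, then the prompt ids
```

Since the model was trained with this exact prefix, inference must reproduce it, which is why a prompt built the ChatGLM (prefix-LM) way gives poor results with ChatGLM2.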