
ChatGLM3-6b: AttributeError: can't set attribute when testing the model #636

Open
padsasdasd opened this issue May 1, 2024 · 1 comment
@padsasdasd

xtuner chat /root/autodl-tmp/add --prompt-template default

Traceback (most recent call last):
  File "/root/ChatGLM3/xtuner/xtuner/tools/chat.py", line 491, in <module>
    main()
  File "/root/ChatGLM3/xtuner/xtuner/tools/chat.py", line 237, in main
    tokenizer = AutoTokenizer.from_pretrained(
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py", line 774, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 2028, in from_pretrained
    return cls._from_pretrained(
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 2260, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/root/.cache/huggingface/modules/transformers_modules/add/tokenization_chatglm.py", line 108, in __init__
    super().__init__(padding_side=padding_side, clean_up_tokenization_spaces=clean_up_tokenization_spaces,
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/tokenization_utils.py", line 363, in __init__
    super().__init__(**kwargs)
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1602, in __init__
    super().__init__(**kwargs)
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 861, in __init__
    setattr(self, key, value)
AttributeError: can't set attribute

The model merge completed successfully, but this error appears every time I try to test the model. Any help would be appreciated.
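The traceback ends in the transformers base class's `__init__`, which runs `setattr(self, key, value)` for each stored init kwarg. If the custom ChatGLM3 tokenizer class exposes one of those names as a read-only `@property`, the `setattr` raises exactly this error. A minimal sketch of the mechanism, with a hypothetical class and attribute name (not the actual ChatGLM3 code):

```python
class TokenizerSketch:
    """Mimics a tokenizer class that exposes a read-only property."""

    @property
    def vocab_size(self):
        # A property defined without a setter is read-only.
        return 64000

    def __init__(self, **kwargs):
        # transformers' base __init__ stores leftover kwargs roughly like this:
        for key, value in kwargs.items():
            setattr(self, key, value)  # raises AttributeError for read-only properties


try:
    TokenizerSketch(vocab_size=130528)
except AttributeError as e:
    # On Python 3.8 the message is "can't set attribute";
    # newer Python versions name the offending property instead.
    print(e)
```

This is why the error depends on the combination of the tokenizer code shipped inside the model directory and the installed transformers version: a kwarg that collides with a read-only property triggers the failure.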

@LZHgrla
Collaborator

LZHgrla commented May 6, 2024

@padsasdasd

This is a known issue with the chatglm3 model series; a similar issue is #221.

You can try replacing the tokenizer files in the merged model with the tokenizer files from before training to resolve this.
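One way to apply that fix is to copy the original tokenizer files over the ones in the merged model directory. The sketch below uses throwaway `mktemp` directories so it is runnable as-is; in practice `BASE` would be your original chatglm3-6b checkpoint and `MERGED` the merged model directory (the file list is an assumption based on the usual ChatGLM3-6b layout -- adjust to whatever tokenizer files your checkpoint actually contains):

```shell
# Hypothetical demo paths; substitute your real directories, e.g.
#   BASE=/root/models/chatglm3-6b   MERGED=/root/autodl-tmp/add
BASE=$(mktemp -d)
MERGED=$(mktemp -d)

# Stand-ins for the original tokenizer files (demo only).
touch "$BASE/tokenizer.model" "$BASE/tokenizer_config.json" "$BASE/tokenization_chatglm.py"

# Overwrite the merged model's tokenizer files with the pre-training originals.
for f in tokenizer.model tokenizer_config.json tokenization_chatglm.py; do
    cp "$BASE/$f" "$MERGED/$f"
done

ls "$MERGED"
```

Back up the merged directory first, and keep the original model's `tokenization_chatglm.py` together with its matching `tokenizer.model`, since the two must agree.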
