New LLaMA-Factory code runs into `batch["input_ids"] is None`; the old version works fine. #3463
Closed
Labels
invalid
Reproduction
```
Traceback (most recent call last):
  File "/root/LLaMA-Factory/src/train_bash.py", line 14, in <module>
    main()
  File "/root/LLaMA-Factory/src/train_bash.py", line 5, in main
    run_exp()
  File "/root/LLaMA-Factory/src/llmtuner/train/tuner.py", line 31, in run_exp
    run_pt(model_args, data_args, training_args, finetuning_args, callbacks)
  File "/root/LLaMA-Factory/src/llmtuner/train/pt/workflow.py", line 47, in run_pt
    train_result = trainer.train(resume_from_checkpoint=training_args.resume_from_checkpoint)
  File "/root/anaconda3/envs/llama_factory/lib/python3.10/site-packages/transformers/trainer.py", line 1780, in train
    return inner_training_loop(
  File "/root/anaconda3/envs/llama_factory/lib/python3.10/site-packages/transformers/trainer.py", line 1843, in _inner_training_loop
    num_train_tokens = self.num_tokens(train_dataloader) * args.num_train_epochs
  File "/root/anaconda3/envs/llama_factory/lib/python3.10/site-packages/transformers/trainer.py", line 1360, in num_tokens
    tokens = batch["input_ids"].numel()
TypeError: 'NoneType' object is not subscriptable
```
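Note what the error actually says: `batch["input_ids"]` is the subscript operation, and "'NoneType' object is not subscriptable" means `batch` itself is `None` here, not that `input_ids` is `None`. A minimal sketch of that failure mode and a defensive counting variant is below; `FakeTensor` and both functions are illustrative stand-ins, not code from transformers or LLaMA-Factory:

```python
class FakeTensor:
    """Illustrative stand-in for torch.Tensor, exposing only .numel()."""

    def __init__(self, *shape):
        self.shape = shape

    def numel(self):
        # Total number of elements = product of the shape dimensions.
        n = 1
        for dim in self.shape:
            n *= dim
        return n


def num_tokens(batches):
    # Mirrors the failing line: assumes every batch is a dict
    # with a tensor under "input_ids".
    return sum(batch["input_ids"].numel() for batch in batches)


def num_tokens_guarded(batches):
    # Defensive variant: skip None batches and missing/None input_ids.
    total = 0
    for batch in batches:
        if batch is None:
            continue
        ids = batch.get("input_ids")
        if ids is not None:
            total += ids.numel()
    return total


batches = [{"input_ids": FakeTensor(2, 8)}, None]  # second batch is None

try:
    num_tokens(batches)
except TypeError as exc:
    print(exc)  # 'NoneType' object is not subscriptable

print(num_tokens_guarded(batches))  # 16
```

In the real trainer this path is only reached when token counting is enabled (the `include_tokens_per_second` training argument, if memory serves), so disabling that is a plausible workaround; the root cause is still that the training dataloader yielded a `None` batch, which points at the data collator or an empty tokenized dataset.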
Expected behavior
No response
System Info
No response
Others
No response