While fine-tuning the unsloth/codellama-7b model using transformers v4.40.1 with save_strategy="epoch", I encountered the following error:

Upon examining the code, I identified the problematic line at this GitHub location:

It appears that during the model's saving process in fine-tuning, self.config.torch_dtype was incorrectly set to the string "bfloat16" instead of torch.bfloat16. Here's a simple fix I implemented:

```python
# Embed positions
if inputs_embeds is None:
    inputs_embeds = self.embed_tokens(input_ids)

# My Modification: convert the string dtype back to a torch.dtype
if self.config.torch_dtype == "bfloat16":
    self.config.torch_dtype = torch.bfloat16
inputs_embeds = inputs_embeds.to(self.config.torch_dtype)
```
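A variant of this fix that is not hard-coded to one dtype could resolve any dtype string via `getattr` on the `torch` module, since `torch.bfloat16`, `torch.float16`, etc. are module attributes. This is only a minimal sketch; `resolve_dtype` is a hypothetical helper name, not part of transformers:

```python
import torch

def resolve_dtype(dtype):
    # Hypothetical helper: config.torch_dtype may arrive either as a
    # torch.dtype or as a string like "bfloat16" (e.g. after the config
    # has been round-tripped through JSON during saving).
    if isinstance(dtype, str):
        # "bfloat16" -> torch.bfloat16, "float16" -> torch.float16, ...
        dtype = getattr(torch, dtype, None)
    if not isinstance(dtype, torch.dtype):
        raise TypeError(f"cannot resolve {dtype!r} to a torch.dtype")
    return dtype

# Usage: normalize before casting the embeddings.
# inputs_embeds = inputs_embeds.to(resolve_dtype(self.config.torch_dtype))
```

This keeps the cast working for any dtype the config might carry, rather than only patching the "bfloat16" case.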