
[BUG] RuntimeError: Invalid device string: 'bfloat16' with transformers v4.40.1 and save_strategy="epoch" #404

Closed
OAHC2022 opened this issue Apr 30, 2024 · 2 comments


OAHC2022 commented Apr 30, 2024

While fine-tuning the unsloth/codellama-7b model with transformers v4.40.1 and save_strategy="epoch", I encountered the following error:

line 540, in LlamaModel_fast_forward
    inputs_embeds = inputs_embeds.to(self.config.torch_dtype)
RuntimeError: Invalid device string: 'bfloat16'
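For context, a minimal sketch of the kind of setup that triggers this, assuming a trl SFTTrainer run; the dataset, text field, and output directory are placeholders, not from the original report:

from datasets import load_dataset
from transformers import TrainingArguments
from trl import SFTTrainer
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/codellama-7b",
    max_seq_length=2048,
    load_in_4bit=True,
)

dataset = load_dataset("json", data_files="train.jsonl", split="train")  # placeholder dataset

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",  # assumes a "text" column
    max_seq_length=2048,
    args=TrainingArguments(
        output_dir="outputs",        # placeholder
        save_strategy="epoch",       # per-epoch checkpointing is what exposes the bug
        bf16=True,
        num_train_epochs=2,
        per_device_train_batch_size=2,
    ),
)
trainer.train()  # the forward pass after the first epoch's checkpoint save then fails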

Upon examining the code, I traced the problem to the following lines in LlamaModel_fast_forward (unsloth/models/llama.py):

# Embed positions
if inputs_embeds is None:
    inputs_embeds = self.embed_tokens(input_ids)

inputs_embeds = inputs_embeds.to(self.config.torch_dtype)
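The error message itself comes from how Tensor.to() dispatches: a bare string argument is parsed as a device name, never as a dtype, so a torch_dtype that has been serialized to the string "bfloat16" fails device parsing. A quick illustration:

import torch

x = torch.randn(2)
x.to(torch.bfloat16)  # fine: the dtype overload of .to()
x.to("bfloat16")      # RuntimeError: Invalid device string: 'bfloat16'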

It appears that during checkpoint saving in fine-tuning, self.config.torch_dtype is set to the string "bfloat16" instead of torch.bfloat16, which Tensor.to() then rejects as an invalid device string. Here's a simple fix I implemented:

# Embed positions
if inputs_embeds is None:
    inputs_embeds = self.embed_tokens(input_ids)

# My modification: saving serializes torch_dtype to a string, so coerce it back
if self.config.torch_dtype == "bfloat16":
    self.config.torch_dtype = torch.bfloat16
inputs_embeds = inputs_embeds.to(self.config.torch_dtype)
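For what it's worth, a slightly more general version of the same coercion (a sketch of my own, not part of the fix above) maps any string-valued dtype back through getattr, so "float16" and "float32" would be handled as well:

import torch

def resolve_torch_dtype(dtype):
    # Config round-trips (e.g. save_pretrained) can leave torch_dtype as a
    # string such as "bfloat16"; map it back to the real torch.dtype.
    if isinstance(dtype, str):
        return getattr(torch, dtype)
    return dtype

assert resolve_torch_dtype("bfloat16") is torch.bfloat16
assert resolve_torch_dtype(torch.float16) is torch.float16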
danielhanchen (Contributor) commented

Oh thanks for that!! Will add your fix in! Thanks!

OAHC2022 (Author) commented May 6, 2024

Thank you!

OAHC2022 closed this as completed May 6, 2024