[Minor] peft bug fix: HF peft version and tokenizer path in peft scripts #493

Open
wants to merge 2 commits into base: main
Conversation

@realAsma realAsma commented Dec 24, 2023

  1. The HF peft LoraLayer init API changed in v0.7.0. With HF peft >= 0.7.0, I get the following error while trying to run examples/peft/peft_lora_clm_instruction_tuning.py:
    return GPTQLoraLinear(
  File "/workspace/AutoGPTQ/auto_gptq/utils/peft_utils.py", line 61, in __init__
    torch.nn.Linear.__init__(self, in_features, out_features)
  File "/home/akuriparambi/anaconda3/envs/agptq/lib/python3.9/site-packages/torch/nn/modules/linear.py", line 96, in __init__
    self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File "/home/akuriparambi/anaconda3/envs/agptq/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1712, in __setattr__
    self.register_parameter(name, value)
  File "/home/akuriparambi/anaconda3/envs/agptq/lib/python3.9/site-packages/torch/nn/modules/module.py", line 577, in register_parameter
    elif hasattr(self, name) and name not in self._parameters:
  File "/home/akuriparambi/anaconda3/envs/agptq/lib/python3.9/site-packages/peft/tuners/tuners_utils.py", line 358, in weight
    weight = base_layer.weight
  File "/home/akuriparambi/anaconda3/envs/agptq/lib/python3.9/site-packages/peft/tuners/tuners_utils.py", line 358, in weight
    weight = base_layer.weight
  File "/home/akuriparambi/anaconda3/envs/agptq/lib/python3.9/site-packages/peft/tuners/tuners_utils.py", line 358, in weight
    weight = base_layer.weight
  [Previous line repeated 975 more times]
  File "/home/akuriparambi/anaconda3/envs/agptq/lib/python3.9/site-packages/peft/tuners/tuners_utils.py", line 352, in weight
    base_layer = self.get_base_layer()
  File "/home/akuriparambi/anaconda3/envs/agptq/lib/python3.9/site-packages/peft/tuners/tuners_utils.py", line 341, in get_base_layer
    while hasattr(base_layer, "base_layer"):
  File "/home/akuriparambi/anaconda3/envs/agptq/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1695, in __getattr__
    raise AttributeError(f"'{type(self).__name__}' object has no attribute '{name}'")
RecursionError: maximum recursion depth exceeded while calling a Python object

I have added the peft installation instructions to the examples README.
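The PR itself pins the peft version in the README rather than patching the code, but the shape of a code-level fix is worth sketching. The helper below is a minimal, hypothetical illustration (the name and branching are assumptions, not part of this PR) of how GPTQLoraLinear could gate on the peft version, since peft >= 0.7.0 wraps the quantized linear as a base_layer instead of relying on torch.nn.Linear subclassing, which is what triggers the recursion above:

```python
def supports_base_layer_api(peft_version: str) -> bool:
    """Return True when the installed HF peft uses the >= 0.7.0 LoraLayer
    init API (base_layer wrapping) rather than the pre-0.7.0 style where
    torch.nn.Linear.__init__ is called directly on the LoRA layer.

    Hypothetical helper for illustration; not part of the PR diff.
    """
    major, minor = (int(part) for part in peft_version.split(".")[:2])
    return (major, minor) >= (0, 7)
```

With a gate like this, GPTQLoraLinear.__init__ could branch between the old direct-init path and the new base_layer path instead of failing with the RecursionError shown above.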

  2. A minor bug in examples/peft: model_name_or_path is used to initialize AutoTokenizer instead of tokenizer_name_or_path.
    model_name_or_path points to the saved GPTQ model, which might not have the tokenizer saved; tokenizer_name_or_path should be used instead.
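A minimal sketch of the tokenizer fix. The helper name is illustrative (not the exact PR diff); the point is simply to prefer the explicit tokenizer path and fall back to the model path only when no tokenizer path is given:

```python
from typing import Optional


def resolve_tokenizer_path(model_name_or_path: str,
                           tokenizer_name_or_path: Optional[str] = None) -> str:
    """Prefer the explicit tokenizer path; the GPTQ checkpoint directory
    in model_name_or_path may not contain tokenizer files."""
    return tokenizer_name_or_path or model_name_or_path


# In the example script, the AutoTokenizer call would then become e.g.:
# tokenizer = AutoTokenizer.from_pretrained(
#     resolve_tokenizer_path(model_name_or_path, tokenizer_name_or_path)
# )
```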

@realAsma realAsma changed the title [Minor] peft bug fix: HF peft version and tokenizer path in left scripts [Minor] peft bug fix: HF peft version and tokenizer path in peft scripts Dec 24, 2023
@YooSungHyun

Which version of auto_gptq do you use?
On my side, the error raised is: can't set attribute 'active_adapter'

I used peft 0.6.2 and auto-gptq 0.6.0+cu118.
