

TypeError: __init__() got an unexpected keyword argument 'llm_int8_skip_modules' #43

Open
AlexeiKaDev opened this issue Nov 26, 2023 · 0 comments

Hi. I'm trying to train locally with my RTX 3060 on Windows 10. Can somebody help me with this error?

These are the steps I took to get it working with CUDA:

python -m venv lora
.\lora\Scripts\activate

pip install -r requirements.lock.txt

pip install pynvml==11.0.0

pip uninstall bitsandbytes

pip install R:/llama/bitsandbytes-0.41.2.post2-py3-none-win_amd64.whl  (installed from a wheel on my local disk)

pip install torch torchvision torchaudio -f https://download.pytorch.org/whl/cu118/torch_stable.html

pip install --upgrade transformers torch

pip install bitsandbytes --upgrade
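Since several of the steps above install and then upgrade overlapping packages, it helps to confirm which versions actually ended up in the venv. A minimal stdlib sketch for that check (package names are the ones installed above; this is a debugging aid, not part of the project):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(pkg: str) -> str:
    """Return the installed version of pkg, or 'not installed' if absent."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return "not installed"

# Print the versions the interpreter will actually import.
for pkg in ("transformers", "torch", "bitsandbytes"):
    print(f"{pkg}: {installed_version(pkg)}")
```

Comparing this output against the versions pinned in `requirements.lock.txt` quickly shows whether the later `--upgrade` calls moved past what the project was tested with.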

And this is the error:

(lora) R:\llama\lora>python app.py --data_dir="./data" --base_model='meta-llama/Llama-2-7b-chat-hf'
fatal: not a git repository (or any of the parent directories): .git
Cannot get git commit hash: Command '['git', 'rev-parse', 'HEAD']' returned non-zero exit status 128.
bin R:\llama\lora\lora\lib\site-packages\bitsandbytes\libbitsandbytes_cuda118.dll

GPU compute capability:  (8, 6)
GPU total number of SMs:  28
GPU total cores:  3584
GPU total memory: 12884901888 bytes (12288.00 MB) (12.00 GB)
CPU available memory: 52328894464 bytes (49904.72 MB) (48.74 GB)
Will keep 2 offloaded models in CPU RAM.

Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████| 2/2 [00:07<00:00,  3.58s/it]
Running on local URL:  http://127.0.0.1:7860

To create a public link, set `share=True` in `launch()`.
Loading base model meta-llama/Llama-2-7b-chat-hf...
Traceback (most recent call last):
  File "R:\llama\lora\llama_lora\ui\finetune\training.py", line 283, in training
    train_output = Global.finetune_train_fn(
  File "R:\llama\lora\llama_lora\lib\finetune.py", line 203, in train
    model = AutoModelForCausalLM.from_pretrained(
  File "R:\llama\lora\lora\lib\site-packages\transformers\models\auto\auto_factory.py", line 566, in from_pretrained
    return model_class.from_pretrained(
  File "R:\llama\lora\lora\lib\site-packages\transformers\modeling_utils.py", line 3236, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
TypeError: __init__() got an unexpected keyword argument 'llm_int8_skip_modules'
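This kind of `TypeError` usually means one layer of the stack forwarded a keyword (here `llm_int8_skip_modules`, a quantization option) to a constructor from a version that doesn't accept it, i.e. a version mismatch introduced by the later `pip install --upgrade` steps. A generic sketch of the failure mode with hypothetical class and function names (not this project's or transformers' actual code):

```python
import inspect

class OldModel:
    """Stands in for a model whose __init__ predates a newer kwarg."""
    def __init__(self, config):
        self.config = config

def construct(cls, config, **kwargs):
    """Forward only the kwargs that cls.__init__ actually accepts."""
    accepted = inspect.signature(cls.__init__).parameters
    filtered = {k: v for k, v in kwargs.items() if k in accepted}
    dropped = sorted(set(kwargs) - set(filtered))
    if dropped:
        print(f"dropping unsupported kwargs: {dropped}")
    return cls(config, **filtered)

# Passing the newer kwarg straight through reproduces the error:
try:
    OldModel({}, llm_int8_skip_modules=["lm_head"])
except TypeError as e:
    print(e)

# Filtering by signature avoids the crash, though the real fix is
# aligning the transformers/bitsandbytes versions with the lock file:
model = construct(OldModel, {}, llm_int8_skip_modules=["lm_head"])
```

In practice the remedy is to keep `transformers` at the version pinned in `requirements.lock.txt` rather than upgrading it afterward, so the caller and the model constructor agree on which keywords exist.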