
NameError: name 'awq_ext' is not defined #3465

Closed
Anorid opened this issue Apr 26, 2024 · 8 comments
Labels
solved This problem has been already solved.

Comments

@Anorid

Anorid commented Apr 26, 2024

Reminder

  • I have read the README and searched the existing issues.

Reproduction

Exception in thread Thread-6 (run_exp):
Traceback (most recent call last):
File "/root/miniconda3/envs/llama-factory/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
self.run()
File "/root/miniconda3/envs/llama-factory/lib/python3.10/threading.py", line 953, in run
self._target(*self._args, **self._kwargs)
File "/root/autodl-tmp/LLaMA-Factory/src/llmtuner/train/tuner.py", line 33, in run_exp
run_sft(model_args, data_args, training_args, finetuning_args, generating_args, callbacks)
File "/root/autodl-tmp/LLaMA-Factory/src/llmtuner/train/sft/workflow.py", line 73, in run_sft
train_result = trainer.train(resume_from_checkpoint=training_args.resume_from_checkpoint)
File "/root/miniconda3/envs/llama-factory/lib/python3.10/site-packages/transformers/trainer.py", line 1624, in train
return inner_training_loop(
File "/root/miniconda3/envs/llama-factory/lib/python3.10/site-packages/transformers/trainer.py", line 1961, in _inner_training_loop
tr_loss_step = self.training_step(model, inputs)
File "/root/miniconda3/envs/llama-factory/lib/python3.10/site-packages/transformers/trainer.py", line 2911, in training_step
self.accelerator.backward(loss)
File "/root/miniconda3/envs/llama-factory/lib/python3.10/site-packages/accelerate/accelerator.py", line 2011, in backward
self.scaler.scale(loss).backward(**kwargs)
File "/root/miniconda3/envs/llama-factory/lib/python3.10/site-packages/torch/_tensor.py", line 525, in backward
torch.autograd.backward(
File "/root/miniconda3/envs/llama-factory/lib/python3.10/site-packages/torch/autograd/init.py", line 267, in backward
_engine_run_backward(
File "/root/miniconda3/envs/llama-factory/lib/python3.10/site-packages/torch/autograd/graph.py", line 744, in _engine_run_backward
return Variable._execution_engine.run_backward( # Calls into the C++ engine to run the backward pass
File "/root/miniconda3/envs/llama-factory/lib/python3.10/site-packages/torch/autograd/function.py", line 301, in apply
return user_fn(self, *args)
File "/root/miniconda3/envs/llama-factory/lib/python3.10/site-packages/torch/utils/checkpoint.py", line 320, in backward
torch.autograd.backward(outputs_with_grad, args_with_grad)
File "/root/miniconda3/envs/llama-factory/lib/python3.10/site-packages/torch/autograd/init.py", line 267, in backward
_engine_run_backward(
File "/root/miniconda3/envs/llama-factory/lib/python3.10/site-packages/torch/autograd/graph.py", line 744, in _engine_run_backward
return Variable._execution_engine.run_backward( # Calls into the C++ engine to run the backward pass
File "/root/miniconda3/envs/llama-factory/lib/python3.10/site-packages/torch/autograd/function.py", line 301, in apply
return user_fn(self, *args)
File "/root/miniconda3/envs/llama-factory/lib/python3.10/site-packages/awq/modules/linear/gemm.py", line 66, in backward
if awq_ext is None:
NameError: name 'awq_ext' is not defined
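
The NameError means the guarded import at the top of awq/modules/linear/gemm.py failed without ever binding the name awq_ext, so even the fallback check if awq_ext is None crashes. A minimal sketch of the pattern involved (reconstructed from the traceback, not the exact AutoAWQ source):

try:
    import awq_ext  # compiled CUDA kernels provided by AutoAWQ_kernels
    AWQ_INSTALLED = True
except ImportError:
    awq_ext = None  # without this assignment, "if awq_ext is None" raises NameError
    AWQ_INSTALLED = False

In other words, the root cause is a missing or broken kernel extension, which the suggestions below address.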

Expected behavior

Training is expected to run through normally.

System Info

No response

Others

No response

@hiyouga
Owner

hiyouga commented Apr 26, 2024

Uninstall autoawq and reinstall it from source: pip install git+https://github.com/casper-hansen/AutoAWQ.git
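
Spelled out, that is roughly the following (a sketch assuming a standard pip environment; -y only skips the confirmation prompt):

pip uninstall autoawq -y
pip install git+https://github.com/casper-hansen/AutoAWQ.git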

@hiyouga hiyouga added the solved This problem has been already solved. label Apr 26, 2024
@GavinWLove

Uninstall autoawq and reinstall it from source: pip install git+https://github.com/casper-hansen/AutoAWQ.git

I still get the same error.
PS:
(llama_factory) root@I199833402100901e91:/LLaMA-Factory-main# pip show autoawq
Name: autoawq
Version: 0.2.4+cu121
Summary: AutoAWQ implements the AWQ algorithm for 4-bit quantization with a 2x speedup during inference.
Home-page: https://github.com/casper-hansen/AutoAWQ
Author: Casper Hansen

@hiyouga
Owner

hiyouga commented Apr 26, 2024

Also install this: https://github.com/casper-hansen/AutoAWQ_kernels#requirements

@GavinWLove

Also install this: https://github.com/casper-hansen/AutoAWQ_kernels#requirements

Thanks, fine-tuning works now with no errors.
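
For anyone hitting the same thing: the requirements page linked above amounts to installing the kernel package, roughly as follows (the autoawq-kernels package name is taken from the AutoAWQ_kernels README; a CUDA build of torch matching the wheel is assumed):

pip install autoawq-kernels

or, building from source:

git clone https://github.com/casper-hansen/AutoAWQ_kernels
cd AutoAWQ_kernels
pip install -e .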

@hiyouga hiyouga closed this as completed Apr 28, 2024
@Anorid
Author

Anorid commented Apr 30, 2024

git clone https://github.com/casper-hansen/AutoAWQ
cd AutoAWQ
pip install -e .

I followed these steps to install and got this error:

(ll) root@autodl-container-3de51192ae-43a34a6c:/autodl-tmp/AutoAWQ# cd ..
(ll) root@autodl-container-3de51192ae-43a34a6c:/autodl-tmp# cd LLaMA-Factory/
(ll) root@autodl-container-3de51192ae-43a34a6c:/autodl-tmp/LLaMA-Factory# cd src/
(ll) root@autodl-container-3de51192ae-43a34a6c:/autodl-tmp/LLaMA-Factory/src# CUDA_VISIBLE_DEVICES=0 python train_web.py
Traceback (most recent call last):
File "/root/autodl-tmp/LLaMA-Factory/src/train_web.py", line 1, in <module>
from llmtuner import create_ui
File "/root/autodl-tmp/LLaMA-Factory/src/llmtuner/__init__.py", line 3, in <module>
from .api import create_app
File "/root/autodl-tmp/LLaMA-Factory/src/llmtuner/api/__init__.py", line 1, in <module>
from .app import create_app
File "/root/autodl-tmp/LLaMA-Factory/src/llmtuner/api/app.py", line 8, in <module>
from ..chat import ChatModel
File "/root/autodl-tmp/LLaMA-Factory/src/llmtuner/chat/__init__.py", line 2, in <module>
from .chat_model import ChatModel
File "/root/autodl-tmp/LLaMA-Factory/src/llmtuner/chat/chat_model.py", line 5, in <module>
from ..hparams import get_infer_args
File "/root/autodl-tmp/LLaMA-Factory/src/llmtuner/hparams/__init__.py", line 6, in <module>
from .parser import get_eval_args, get_infer_args, get_train_args
File "/root/autodl-tmp/LLaMA-Factory/src/llmtuner/hparams/parser.py", line 14, in <module>
from ..extras.misc import check_dependencies, get_current_device
File "/root/autodl-tmp/LLaMA-Factory/src/llmtuner/extras/misc.py", line 6, in <module>
from peft import PeftModel
File "/root/miniconda3/envs/ll/lib/python3.10/site-packages/peft/__init__.py", line 22, in <module>
from .auto import (
File "/root/miniconda3/envs/ll/lib/python3.10/site-packages/peft/auto.py", line 32, in <module>
from .mapping import MODEL_TYPE_TO_PEFT_MODEL_MAPPING
File "/root/miniconda3/envs/ll/lib/python3.10/site-packages/peft/mapping.py", line 22, in <module>
from .mixed_model import PeftMixedModel
File "/root/miniconda3/envs/ll/lib/python3.10/site-packages/peft/mixed_model.py", line 26, in <module>
from peft.tuners.mixed import COMPATIBLE_TUNER_TYPES
File "/root/miniconda3/envs/ll/lib/python3.10/site-packages/peft/tuners/__init__.py", line 21, in <module>
from .lora import LoraConfig, LoraModel, LoftQConfig
File "/root/miniconda3/envs/ll/lib/python3.10/site-packages/peft/tuners/lora/__init__.py", line 20, in <module>
from .model import LoraModel
File "/root/miniconda3/envs/ll/lib/python3.10/site-packages/peft/tuners/lora/model.py", line 49, in <module>
from .awq import dispatch_awq
File "/root/miniconda3/envs/ll/lib/python3.10/site-packages/peft/tuners/lora/awq.py", line 26, in <module>
from awq.modules.linear import WQLinear_GEMM
File "/root/autodl-tmp/AutoAWQ/awq/__init__.py", line 2, in <module>
from awq.models.auto import AutoAWQForCausalLM
File "/root/autodl-tmp/AutoAWQ/awq/models/__init__.py", line 19, in <module>
from .starcoder2 import Starcoder2AWQForCausalLM
File "/root/autodl-tmp/AutoAWQ/awq/models/starcoder2.py", line 7, in <module>
from transformers.models.starcoder2.modeling_starcoder2 import (
ModuleNotFoundError: No module named 'transformers.models.starcoder2'

@L9qmzn

L9qmzn commented May 2, 2024

git clone https://github.com/casper-hansen/AutoAWQ
cd AutoAWQ
pip install -e .

I followed these steps to install and got this error:

[same traceback as above, ending in ModuleNotFoundError: No module named 'transformers.models.starcoder2']

Updating transformers to version 4.40.1 solved this problem for me.
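
For reference, that would be (assuming pip):

pip install --upgrade transformers==4.40.1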

@Anorid
Author

Anorid commented May 6, 2024

[screenshot attached]

@Anorid
Author

Anorid commented May 6, 2024

Now even the install fails for me.
