
load_model failed #389

Open · wygao8 opened this issue Mar 11, 2024 · 1 comment

wygao8 commented Mar 11, 2024

System Info

[pip3] flake8==7.0.0
[pip3] flake8-bugbear==24.2.6
[pip3] mypy-extensions==1.0.0
[pip3] numpy==1.26.4
[pip3] torch==2.2.1+cu118
[pip3] torchdata==0.7.1+cpu
[pip3] torchtext==0.17.1
[pip3] triton==2.2.0
[conda] numpy 1.26.4 pypi_0 pypi
[conda] torch 2.2.1+cu118 pypi_0 pypi
[conda] torchdata 0.7.1+cpu pypi_0 pypi
[conda] torchtext 0.17.1 pypi_0 pypi
[conda] triton 2.2.0 pypi_0 pypi

Information

  • The official example scripts
  • My own modified scripts

🐛 Describe the bug

I followed the instructions to install llama-recipes and run inference with Llama-2-70b-chat:

pip install --extra-index-url https://download.pytorch.org/whl/test/cu118 llama-recipes
python examples/chat_completion/chat_completion.py --model_name "PATH/TO/MODEL/70B/" --prompt_file examples/chat_completion/chats.json  --quantization --use_auditnlg

However, it failed to load the model with the following error:

  File "examples/example_chat_completion.py", line 64, in main
    model = load_model(model_name, quantization, use_fast_kernels)
TypeError: load_model() takes 2 positional arguments but 3 were given

I checked the history of llama_recipes.inference.model_utils.load_model and found that its signature changed last month. Installing llama-recipes from source fixed the issue.

Is the pip-downloaded version outdated?
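
For context, a minimal sketch of the mismatch; the two-argument signature below is an assumption inferred from the traceback, not copied from the released package:

# Assumed signature in the pip-released llama-recipes (accepts only two arguments):
def load_model(model_name, quantization):
    ...

model_name = "PATH/TO/MODEL/70B/"
quantization = True
use_fast_kernels = True

# The newer example script passes a third argument, so the call fails:
model = load_model(model_name, quantization, use_fast_kernels)
# TypeError: load_model() takes 2 positional arguments but 3 were given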

Error logs

  File "examples/example_chat_completion.py", line 64, in main
    model = load_model(model_name, quantization, use_fast_kernels)
TypeError: load_model() takes 2 positional arguments but 3 were given

Expected behavior

The pip package should be updated so that the released load_model matches the current example scripts.

@HamidShojanazeri
Contributor

Thanks @wygao8 for reporting this. We are getting ready for a new release; in the meantime, please install from source with pip install -e ., which should help.
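
For anyone following along, installing from source would look roughly like this (the repository URL is assumed from the project name; adjust if it differs):

# Clone the repository and install it in editable mode:
git clone https://github.com/facebookresearch/llama-recipes.git
cd llama-recipes
pip install -e .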
