
Llama2-7b doesn't support transformers >=4.38 #31

Open
hkvision opened this issue Apr 8, 2024 · 2 comments
Labels
enhancement New feature or request

Comments

hkvision (Collaborator) commented Apr 8, 2024

Traceback (most recent call last):
  File "/home/arda/kai/webui/text-generation-webui/modules/callbacks.py", line 61, in gentask
    ret = self.mfunc(callback=_callback, *args, **self.kwargs)
  File "/home/arda/kai/webui/text-generation-webui/modules/text_generation.py", line 392, in generate_with_callback
    shared.model.generate(**kwargs)
  File "/opt/anaconda3/envs/text-webui-upstream/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/opt/anaconda3/envs/text-webui-upstream/lib/python3.9/site-packages/transformers/generation/utils.py", line 1592, in generate
    return self.sample(
  File "/opt/anaconda3/envs/text-webui-upstream/lib/python3.9/site-packages/transformers/generation/utils.py", line 2696, in sample
    outputs = self(
  File "/opt/anaconda3/envs/text-webui-upstream/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/opt/anaconda3/envs/text-webui-upstream/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
  File "/opt/anaconda3/envs/text-webui-upstream/lib/python3.9/site-packages/transformers/models/llama/modeling_llama.py", line 1176, in forward
    outputs = self.model(
  File "/opt/anaconda3/envs/text-webui-upstream/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/opt/anaconda3/envs/text-webui-upstream/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
TypeError: llama_model_forward_4_36() got an unexpected keyword argument 'cache_position'
Output generated in 0.17 seconds (0.00 tokens/s, 0 tokens, context 72, seed 1344122438)

A week ago they upgraded to 4.39:
oobabooga@3ce0d92

@hkvision hkvision added the enhancement New feature or request label Apr 8, 2024
hkvision (Collaborator, Author) commented Apr 8, 2024

Mistral works, so we are using Mistral to test against transformers 4.38.
To run Llama, you currently need to downgrade transformers to 4.37.

cc @jason-dai @sgwhat @shane-huang
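The traceback shows why the downgrade is needed: transformers >= 4.38 started passing a new `cache_position` keyword into the Llama forward call, and a patched forward written against the 4.36 signature rejects it. As a minimal sketch (the stand-in `llama_model_forward_4_36` and the `tolerate_new_kwargs` wrapper below are illustrative, not this project's actual code), one generic way to survive such signature drift is a shim that filters out keyword arguments the patched function does not accept:

```python
import inspect
from functools import wraps

def tolerate_new_kwargs(fn):
    """Wrap fn so keyword arguments it does not declare are dropped
    instead of raising TypeError. Note: this silently discards the new
    kwargs, so it only works when ignoring them is semantically safe."""
    accepted = set(inspect.signature(fn).parameters)

    @wraps(fn)
    def wrapper(*args, **kwargs):
        filtered = {k: v for k, v in kwargs.items() if k in accepted}
        return fn(*args, **filtered)

    return wrapper

# Stand-in for a forward patched against the 4.36 API, which has no
# `cache_position` parameter (hypothetical body for illustration):
def llama_model_forward_4_36(input_ids=None, attention_mask=None,
                             past_key_values=None):
    return {"input_ids": input_ids}

safe_forward = tolerate_new_kwargs(llama_model_forward_4_36)

# A 4.38-style call that includes the new kwarg no longer raises:
out = safe_forward(input_ids=[1, 2, 3], cache_position=[0, 1, 2])
```

Whether silently dropping `cache_position` is actually correct depends on the patched implementation (it may need to consume the new argument rather than ignore it), so the real fix is to update the patch for the 4.38+ signature; the shim is only a stopgap.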

@github-actions github-actions bot added the stale label May 20, 2024
github-actions bot commented:

This issue has been closed due to inactivity for 6 weeks. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.

@hkvision hkvision reopened this May 21, 2024
@github-actions github-actions bot removed the stale label May 21, 2024