
Installation Issue #12

Open · orrzohar opened this issue Apr 30, 2024 · 2 comments

I followed your installation protocol:

git clone https://github.com/mbzuai-oryx/LLaVA-pp.git
cd LLaVA-pp
git submodule update --init --recursive

# Copy necessary files
cp LLaMA-3-V/train.py LLaVA/llava/train/train.py
cp LLaMA-3-V/conversation.py LLaVA/llava/conversation.py
cp LLaMA-3-V/builder.py LLaVA/llava/model/builder.py
cp LLaMA-3-V/llava_llama.py LLaVA/llava/model/language_model/llava_llama.py

# Training commands
cp scripts/LLaMA3-V_pretrain.sh LLaVA/LLaMA3-V_pretrain.sh
cp scripts/LLaMA3-V_finetune_lora.sh LLaVA/LLaMA3-V_finetune_lora.sh

conda create -n llavapp python=3.10 -y
conda activate llavapp
pip install --upgrade pip  # enable PEP 660 support

cd LLaVA
pip install -e .

pip install -e ".[train]"
pip install flash-attn --no-build-isolation
pip install git+https://github.com/huggingface/transformers@a98c41798cf6ed99e1ff17e3792d6e06a2ff2ff3
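
For reference, a quick sanity check that flash-attn actually imports against the installed PyTorch (assuming the llavapp environment above is active) would be:

# Print the PyTorch version and the CUDA version it was built with
python -c "import torch; print(torch.__version__, torch.version.cuda)"
# Confirm flash-attn loads without an undefined-symbol error
python -c "import flash_attn; print(flash_attn.__version__)"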

I then run

bash LLaMA3-V_pretrain.sh

And get:

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/pasteur/u/orrzohar/LLaVA-pp/LLaVA/llava/train/train_mem.py", line 1, in <module>
    from llava.train.train import train
  File "/pasteur/u/orrzohar/LLaVA-pp/LLaVA/llava/__init__.py", line 1, in <module>
    from .model import LlavaLlamaForCausalLM
  File "/pasteur/u/orrzohar/LLaVA-pp/LLaVA/llava/model/__init__.py", line 1, in <module>
    from .language_model.llava_llama import LlavaLlamaForCausalLM, LlavaConfig
  File "/pasteur/u/orrzohar/LLaVA-pp/LLaVA/llava/model/language_model/llava_llama.py", line 23, in <module>
    from transformers import AutoConfig, AutoModelForCausalLM, \
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "/pasteur/u/orrzohar/miniconda3/envs/llavapp2/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1506, in __getattr__
    value = getattr(module, name)
  File "/pasteur/u/orrzohar/miniconda3/envs/llavapp2/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1505, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/pasteur/u/orrzohar/miniconda3/envs/llavapp2/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1517, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.llama.modeling_llama because of the following error (look up to see its traceback):
/pasteur/u/orrzohar/miniconda3/envs/llavapp2/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZNK3c106SymIntltEl
[2024-04-30 15:59:00,176] [INFO] [launch.py:315:sigkill_handler] Killing subprocess 4141143
[2024-04-30 15:59:00,195] [INFO] [launch.py:315:sigkill_handler] Killing subprocess 4141144
[2024-04-30 15:59:00,195] [INFO] [launch.py:315:sigkill_handler] Killing subprocess 4141145
[2024-04-30 15:59:00,205] [INFO] [launch.py:315:sigkill_handler] Killing subprocess 4141146

Best,
Orr

mmaaz60 (Member) commented May 1, 2024

Hi @orrzohar,

Thank you for your interest in our work. The issue you are encountering is caused by flash-attention not being installed properly. The following should help:

  1. Make sure that the CUDA version on your machine matches the CUDA version PyTorch was built with, and then try reinstalling flash-attention (see the version-check sketch after these steps).
  2. If that does not solve the issue, try installing flash-attention from source using the instructions below:
git clone https://github.com/HazyResearch/flash-attention.git
cd flash-attention
python setup.py install
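
As a rough sketch of the check in step 1 (assuming nvcc is on the PATH; exact commands may differ on your setup):

# CUDA toolkit version installed on the machine
nvcc --version
# CUDA version PyTorch was compiled against
python -c "import torch; print(torch.__version__, torch.version.cuda)"
# If the two match, rebuild/reinstall flash-attention against this PyTorch
pip uninstall -y flash-attn
pip install flash-attn --no-build-isolation --no-cache-dir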

I hope this solves the issue. Good luck!

orrzohar (Author) commented May 1, 2024

Hi @mmaaz60,

Thank you for your reply, I'll try this. I had used the command from the LLaVA documentation: pip install flash-attn --no-build-isolation.

I'll update if this works,

Best,
Orr
