
The loss is NaN when pre-training TinyLlama using the share recipe #13

Open
xushilin1 opened this issue Mar 4, 2024 · 1 comment

@xushilin1
Here is my training script:

deepspeed tinyllava/train/train.py \
    --deepspeed ./scripts/zero2.json \
    --model_name_or_path checkpoints/TinyLlama-1.1B-Chat-v1.0/ \
    --version plain \
    --data_path datasets/LLaVA-Pretrain/blip_laion_cc_sbu_558k.json \
    --image_folder datasets/LLaVA-Pretrain/images \
    --vision_tower checkpoints/clip-vit-large-patch14-336 \
    --pretrain_mm_mlp_adapter output/pretrain/llava-tinyllama-1.1b/mm_projector.bin \
    --mm_projector_type mlp2x_gelu \
    --tune_entire_model True \
    --tune_vit_from_layer 12 \
    --mm_vision_select_layer -2 \
    --mm_use_im_start_end False \
    --mm_use_im_patch_token False \
    --bf16 True \
    --output_dir output/pretrain/llava-tinyllama-1.1b_share \
    --num_train_epochs 1 \
    --per_device_train_batch_size 32 \
    --per_device_eval_batch_size 4 \
    --gradient_accumulation_steps 1 \
    --evaluation_strategy "no" \
    --save_strategy "steps" \
    --save_steps 24000 \
    --save_total_limit 1 \
    --learning_rate 2e-5 \
    --weight_decay 0. \
    --warmup_ratio 0.03 \
    --lr_scheduler_type "cosine" \
    --logging_steps 1 \
    --tf32 True \
    --model_max_length 2048 \
    --gradient_checkpointing True \
    --dataloader_num_workers 4 \
    --lazy_preprocess True
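
For context, the --deepspeed flag above points at ./scripts/zero2.json. For readers without the repo checked out, a ZeRO stage-2 config of this general shape is what such scripts typically use; this is a minimal sketch, not the repo's actual file ("auto" values are filled in by the HuggingFace Trainer integration):

{
  "bf16": { "enabled": "auto" },
  "fp16": { "enabled": "auto" },
  "train_micro_batch_size_per_gpu": "auto",
  "train_batch_size": "auto",
  "gradient_accumulation_steps": "auto",
  "zero_optimization": {
    "stage": 2,
    "overlap_comm": true,
    "contiguous_gradients": true
  }
}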
@tsw123678
Maybe you should try a lower DeepSpeed version, like 0.10?
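
If you want to try that suggestion, pinning DeepSpeed is a one-liner; 0.10.0 is used below as one example of the 0.10.x series mentioned above:

pip install deepspeed==0.10.0  # any 0.10.x release; adjust as needed
python -c "import deepspeed; print(deepspeed.__version__)"  # confirm the pinned version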
