CUDA out of memory #6

Open
integrum-aiktuck opened this issue May 19, 2023 · 0 comments
integrum-aiktuck commented May 19, 2023

Hello, I'm fine-tuning on a T4 GPU with 16 GB of memory, but I keep hitting a CUDA out of memory error. Could you share the parameters you used for fine-tuning? Thanks!

MICRO_BATCH_SIZE = 4 # this could actually be 5 but i like powers of 2
BATCH_SIZE = 8
MAX_STEPS = None
GRADIENT_ACCUMULATION_STEPS = BATCH_SIZE // MICRO_BATCH_SIZE
EPOCHS = 3 # we don't always need 3 tbh
LEARNING_RATE = 3e-4 # the Karpathy constant
CUTOFF_LEN = 256 # 256 accounts for about 96% of the data
LORA_R = 8
LORA_ALPHA = 16
LORA_DROPOUT = 0.05
VAL_SET_SIZE = args.test_size # 2000
TARGET_MODULES = [
    "q_proj",
    "v_proj",
]
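
For reference, a minimal sketch of how hyperparameters like these are typically wired into a LoRA fine-tuning run with `transformers` + `peft`, plus the memory-saving switches (8-bit loading, fp16, gradient checkpointing) that usually make the difference on a 16 GB T4. The model name and the use of 8-bit loading are assumptions, not something stated in this issue:

```python
# Hypothetical sketch: standard transformers/peft LoRA setup with memory-saving options.
# The base model name below is a placeholder assumption.
import torch
from transformers import AutoModelForCausalLM, TrainingArguments
from peft import LoraConfig, get_peft_model

BASE_MODEL = "decapoda-research/llama-7b-hf"  # assumption: replace with the actual base model

# Load the base model in 8-bit to cut weight memory roughly in half vs fp16
# (requires bitsandbytes); device_map="auto" places layers on the available GPU.
model = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL,
    load_in_8bit=True,
    torch_dtype=torch.float16,
    device_map="auto",
)
model.gradient_checkpointing_enable()  # trade compute for activation memory

lora_config = LoraConfig(
    r=LORA_R,
    lora_alpha=LORA_ALPHA,
    target_modules=TARGET_MODULES,
    lora_dropout=LORA_DROPOUT,
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)  # only LoRA adapters are trainable

training_args = TrainingArguments(
    per_device_train_batch_size=MICRO_BATCH_SIZE,  # lower this first if OOM persists
    gradient_accumulation_steps=GRADIENT_ACCUMULATION_STEPS,
    num_train_epochs=EPOCHS,
    learning_rate=LEARNING_RATE,
    fp16=True,
    output_dir="lora-out",
)
```

If OOM still occurs, the usual knobs are reducing `MICRO_BATCH_SIZE` (raising `GRADIENT_ACCUMULATION_STEPS` to keep the effective batch size) or shortening `CUTOFF_LEN`, since activation memory grows with sequence length.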
