What is the intended way of saving the base model when finetuning both an adapter and some layers in the base model? #1546

Answered by younesbelkada
samedii asked this question in Q&A

Hi @samedii,
In order to save both the adapter weights and some layers of the base model (which I assume are trainable), you need to declare `modules_to_save` in your `PeftConfig`; PEFT will then automatically take care of saving and loading the correct modules.
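For illustration, here is a minimal sketch of that setup, assuming a LoRA adapter on a BERT sequence classifier; the model name, target modules, and output path are placeholders, not anything prescribed in the answer:

```python
from transformers import AutoModelForSequenceClassification
from peft import LoraConfig, get_peft_model

# Hypothetical base model: a BERT classifier whose randomly initialized
# head ("classifier") we also want to train and save alongside the adapter.
base_model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["query", "value"],  # attention projections to adapt with LoRA
    modules_to_save=["classifier"],     # base-model layers to keep trainable and save in full
)

model = get_peft_model(base_model, config)
# save_pretrained writes the adapter weights plus full copies
# of every module listed in modules_to_save.
model.save_pretrained("my-lora-adapter")
```

When reloading, `PeftModel.from_pretrained(base_model, "my-lora-adapter")` restores both the adapter weights and the saved copies of the `modules_to_save` layers on top of the base model.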
