
Finetuning with adapters #3136

Closed · Answered by hiyouga
cosminroger asked this question in Q&A

When you select an adapter from the dropdown menu and fine-tune it on a new dataset, there are two possibilities. If you enabled create_new_adapter in the LoRA tab, the newly trained adapter does not contain the knowledge from the previous dataset, so you need to specify both adapters when running inference. Otherwise, the newly trained adapter contains the knowledge from all the datasets, and you only need to load the last adapter for inference.

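For reference, the first case (two adapters that must be combined at inference) can be reproduced outside the web UI with Hugging Face PEFT directly. This is a minimal sketch under that assumption, not the project's own loading code; the base model name and adapter paths are placeholders.

```python
# Minimal sketch: load a base model plus two LoRA adapters for inference.
# The model name and adapter paths are placeholders, not values from this thread.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_name = "meta-llama/Llama-2-7b-hf"   # placeholder base model
adapter_a = "output/adapter_dataset_a"         # adapter trained on the first dataset
adapter_b = "output/adapter_dataset_b"         # adapter trained with create_new_adapter

tokenizer = AutoTokenizer.from_pretrained(base_model_name)
model = AutoModelForCausalLM.from_pretrained(base_model_name)

# Apply the first adapter and fold its weights into the base model,
# then apply the second adapter on top of the merged weights, so both
# datasets' knowledge is present at inference time.
model = PeftModel.from_pretrained(model, adapter_a)
model = model.merge_and_unload()
model = PeftModel.from_pretrained(model, adapter_b)

# In the second case (create_new_adapter disabled), training continued from the
# previous adapter, so loading only the last adapter would be enough:
# model = PeftModel.from_pretrained(base_model, adapter_b)
```

Merging the first adapter before applying the second keeps the composition order the same as the order in which the adapters were trained; the exact behavior inside the web UI may differ by version, so treat this as an approximation.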
Answer selected by cosminroger