
[FEATURE] Is it possible to add hparams to model.default_cfg? #1959

Open
luckyhug opened this issue Sep 20, 2023 · 0 comments
Labels
enhancement New feature or request

Comments

@luckyhug

Is your feature request related to a problem? Please describe.
When I look up a model such as https://huggingface.co/timm/convnext_large_mlp.clip_laion2b_augreg_ft_in1k_384, the model card says it was fine-tuned on ImageNet-1k in timm by Ross Wightman. Although the card links to more details about the pretraining, the hparams for the fine-tuning run itself are hard to find.

Describe the solution you'd like
Maybe the hparams could be added to a model.finetune_cfg to provide more useful information?
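
A minimal sketch of what this could look like, assuming the attribute name `finetune_cfg` (which does not exist in timm today; only `default_cfg` does):

```python
import timm

# Inspect the config timm exposes today (weights download on first use).
model = timm.create_model(
    "convnext_large_mlp.clip_laion2b_augreg_ft_in1k_384", pretrained=True
)
print(model.default_cfg)  # input size, mean/std, crop, interpolation, ... but no training hparams

# Hypothetical: the proposed attribute would sit alongside default_cfg and
# surface the fine-tuning hyper-parameters (lr, epochs, augmentation, ...).
finetune_cfg = getattr(model, "finetune_cfg", None)  # None in current timm
print(finetune_cfg)
```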

Describe alternatives you've considered
Or maybe the args.yaml file could be provided with, or linked from, the model card?
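
If such a file were uploaded to the model repo, it could be fetched directly. This is a sketch assuming the file is named args.yaml and actually exists in the repo (it may not today, in which case the download call will raise an error):

```python
import yaml
from huggingface_hub import hf_hub_download

# Download the (assumed) training-args file from the Hub model repo.
path = hf_hub_download(
    repo_id="timm/convnext_large_mlp.clip_laion2b_augreg_ft_in1k_384",
    filename="args.yaml",
)
with open(path) as f:
    train_args = yaml.safe_load(f)
print(train_args)  # e.g. lr, epochs, weight decay, augmentation settings
```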

Additional context
Thank you very much! I found some ConvNeXt hparams at https://gist.github.com/rwightman/ee0b02c1e99a0761264d1d1319e95e5b, but only for the nano and atto variants. I'm not sure whether those are still strong hparams for fine-tuning large models. Should I base my sweep on these much smaller models' hparams?

@luckyhug luckyhug added the enhancement New feature or request label Sep 20, 2023