
Support for single fold run #62

Open
AlessandroLovo opened this issue Feb 27, 2023 · 1 comment
Labels: enhancement (New feature or request)

Comments

@AlessandroLovo (Collaborator)

Allow running a single fold instead of the standard k-fold cross-validation.
Also, possibly return the trained model.

see #59

AlessandroLovo added the enhancement (New feature or request) label on Feb 27, 2023
@AlessandroLovo (Collaborator, Author) commented Jul 27, 2023

It is possible to run a single fold using min_folds_before_pruning=1 and prune_threshold=-np.infty.
This ensures the run will always be pruned after only one fold.
Then you can use nfolds and val_folds to tune your train/validation split. For example, nfolds=10, val_folds=1 (the default) uses 90% of the data for training and 10% for validation. More generally, the validation fraction is val_folds/nfolds, so with nfolds=10, val_folds=3 you can achieve a 70-30 split.
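A minimal sketch of how these keyword arguments could be combined; only the argument names and values come from this comment, while `k_fold_cross_val` is a placeholder for the library's actual training entry point and is an assumption, not the confirmed API:

```python
import numpy as np

# Hypothetical sketch: the keyword values below come from the comment above;
# `k_fold_cross_val` is a placeholder name for the library's training entry
# point, so the actual call may differ.
single_fold_kwargs = dict(
    nfolds=10,                   # validation fraction = val_folds/nfolds
    val_folds=3,                 # -> 70% training / 30% validation
    min_folds_before_pruning=1,  # pruning is allowed to trigger after the first fold
    prune_threshold=-np.infty,   # always triggers pruning, so only one fold runs
)

# e.g. k_fold_cross_val(**single_fold_kwargs)  # placeholder call, adapt to the real entry point
```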

This doesn't return the trained model, though.
