It is possible to run a single fold by setting min_folds_before_pruning=1 and prune_threshold=-np.inf (np.infty is a deprecated alias for np.inf and was removed in NumPy 2.0).
This ensures the run is always pruned after only one fold.
Then you can use nfolds and val_folds to tune your train/val split. For example, nfolds=10, val_folds=1 (the default) uses 90% of the data for training and 10% for validation. More generally, the validation fraction is val_folds/nfolds, so with nfolds=10, val_folds=3 you get a 70/30 split.
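As a quick sanity check of the arithmetic above, here is a minimal sketch (not the library's actual API — the helper name is hypothetical) that computes the train/validation fractions implied by nfolds and val_folds:

```python
def split_fractions(nfolds: int, val_folds: int = 1) -> tuple[float, float]:
    """Return (train_fraction, val_fraction) implied by the fold counts.

    Hypothetical helper for illustration only; the actual split is done
    by the library's cross-validation machinery.
    """
    if not 0 < val_folds < nfolds:
        raise ValueError("val_folds must be between 1 and nfolds - 1")
    val = val_folds / nfolds
    return 1.0 - val, val

# nfolds=10, val_folds=1 -> 90/10 split
print(split_fractions(10, 1))  # (0.9, 0.1)
# nfolds=10, val_folds=3 -> 70/30 split
print(split_fractions(10, 3))  # (0.7, 0.3)
```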
Allow running a single fold instead of the standard k-fold cross-validation.
It might also be useful to return the trained model.
see #59