
add DataLoader related parameters to fit() and predict() #2295

Open
wants to merge 7 commits into
base: master

Conversation

BohdanBilonoh
Contributor

Checklist before merging this PR:

  • Mentioned all issues that this PR fixes or addresses.
  • Summarized the updates of this PR under Summary.
  • Added an entry under Unreleased in the Changelog.

Summary

Add torch.utils.data.DataLoader related parameters to the fit() and predict() methods of TorchForecastingModel.
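A minimal sketch of how such parameters might be collected and forwarded to the DataLoader constructor. The helper name and the exact parameter set are assumptions for illustration; the discussion below mentions prefetch_factor, persistent_workers, and pin_memory as the parameters of interest.

```python
# Hypothetical helper (not the actual Darts implementation): gather the
# DataLoader-related fit()/predict() parameters into the kwargs dict that
# would be forwarded to torch.utils.data.DataLoader.
def build_dataloader_kwargs(num_loader_workers=0,
                            prefetch_factor=None,
                            persistent_workers=False,
                            pin_memory=False):
    kwargs = {
        "num_workers": num_loader_workers,
        "persistent_workers": persistent_workers,
        "pin_memory": pin_memory,
    }
    # In torch, prefetch_factor is only valid when num_workers > 0,
    # so it is only forwarded when explicitly set.
    if prefetch_factor is not None:
        kwargs["prefetch_factor"] = prefetch_factor
    return kwargs
```

For example, `build_dataloader_kwargs(num_loader_workers=4, pin_memory=True)` would yield a dict suitable for unpacking into the DataLoader constructor.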


codecov bot commented Apr 9, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 93.77%. Comparing base (0a72bf6) to head (719b7ab).

Additional details and impacted files
@@            Coverage Diff             @@
##           master    #2295      +/-   ##
==========================================
- Coverage   93.79%   93.77%   -0.02%     
==========================================
  Files         138      138              
  Lines       14288    14274      -14     
==========================================
- Hits        13401    13386      -15     
- Misses        887      888       +1     

☔ View full report in Codecov by Sentry.

@madtoinou
Collaborator

madtoinou commented May 6, 2024

Hi @BohdanBilonoh,

It looks great. However, to make it easier to maintain and more exhaustive, I think it would be better to add a single argument called dataloader_kwargs, check that the arguments explicitly used by Darts are not redundant/overwritten, and then pass it down to the DataLoader constructor.

This would allow users to specify more than just prefetch_factor, persistent_workers, and pin_memory, while limiting copy-pasting from other libraries' documentation (linking to the torch.utils.data.DataLoader page for this argument does sound like a good idea, however).

PS: Apologies for taking so long with the review of this PR.
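The suggested approach can be sketched as follows. This is an illustrative outline, not the actual Darts code: the function name and the reserved-key set are assumptions, standing in for whatever arguments Darts sets internally when building its DataLoader.

```python
# Sketch of the suggested dataloader_kwargs handling (assumed names): accept a
# free-form dict, reject keys that Darts sets itself, and forward the rest to
# the torch.utils.data.DataLoader constructor.
RESERVED_KEYS = {"dataset", "batch_size", "shuffle", "collate_fn"}  # assumed

def merge_dataloader_kwargs(darts_kwargs, dataloader_kwargs=None):
    dataloader_kwargs = dataloader_kwargs or {}
    clashes = RESERVED_KEYS & dataloader_kwargs.keys()
    if clashes:
        raise ValueError(
            "dataloader_kwargs must not override arguments set by Darts: "
            f"{sorted(clashes)}"
        )
    # The merged dict would then be unpacked into DataLoader(**merged).
    return {**darts_kwargs, **dataloader_kwargs}
```

The explicit clash check surfaces a clear error instead of silently letting user kwargs overwrite the values Darts relies on.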


@joshua-xia

joshua-xia commented May 7, 2024

Hi @BohdanBilonoh, would you please add the multiprocessing_context parameter for DataLoader? It is useful when using multiple workers for the dataloader. Thanks!

@BohdanBilonoh refer to #2375

@BohdanBilonoh My bad: @madtoinou's suggestion to add dataloader_kwargs, which lets users pass any dataloader parameters freely, is the better approach; there is no need to support a dedicated multiprocessing_context parameter.
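To illustrate the point: with a free-form dataloader_kwargs argument (assumed name, per the suggestion above), multiprocessing_context needs no dedicated support, since it is already a valid torch.utils.data.DataLoader constructor argument.

```python
# Hypothetical usage sketch: any DataLoader constructor argument, including
# multiprocessing_context, can ride along in the free-form kwargs dict.
dataloader_kwargs = {
    "num_workers": 4,
    "multiprocessing_context": "spawn",  # accepted by DataLoader when num_workers > 0
}
```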


4 participants