[Pix2Pix] Use fit_generator to speed up training process #247

Open
wave-transmitter opened this issue Jan 27, 2021 · 0 comments

Hello GAN-comrades,

in this implementation train_on_batch is used to train the network, which seems like the best option, since each iteration requires a two-step process of training the generator and the discriminator separately.

            # ---------------------
            #  Train Discriminator
            # ---------------------

            # Condition on B and generate a translated version
            fake_A = self.generator.predict(imgs_B)

            # Train the discriminators (original images = real / generated = Fake)
            d_loss_real = self.discriminator.train_on_batch([imgs_A, imgs_B], valid)
            d_loss_fake = self.discriminator.train_on_batch([fake_A, imgs_B], fake)
            d_loss = 0.5 * np.add(d_loss_real, d_loss_fake)

            # -----------------
            #  Train Generator
            # -----------------

            # Train the generators
            g_loss = self.combined.train_on_batch([imgs_A, imgs_B], [valid, imgs_A])

On the other hand, fit_generator makes it possible to speed up training and to deal with the CPU bottleneck of data preprocessing via its workers argument. I was wondering if fit_generator could somehow be used in our case.
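For context, this is the kind of call I mean; the Sequence subclass and the toy model here are just illustrative, not taken from this repo. The workers argument runs the Python-side batch preparation in parallel processes:

    import numpy as np
    from keras.models import Sequential
    from keras.layers import Dense
    from keras.utils import Sequence

    # Toy stand-in for the real data pipeline: __getitem__ is where the
    # CPU-bound preprocessing would happen, and it is what `workers`
    # parallelizes across processes.
    class ToySequence(Sequence):
        def __init__(self, n_batches=100, batch_size=32):
            self.n_batches = n_batches
            self.batch_size = batch_size

        def __len__(self):
            return self.n_batches

        def __getitem__(self, idx):
            x = np.random.rand(self.batch_size, 8).astype("float32")
            y = (x.sum(axis=1, keepdims=True) > 4).astype("float32")
            return x, y

    model = Sequential([Dense(16, activation="relu", input_shape=(8,)),
                        Dense(1, activation="sigmoid")])
    model.compile(optimizer="adam", loss="binary_crossentropy")

    # Batch preparation now overlaps with the GPU work, which is the
    # speed-up I am after.
    model.fit_generator(ToySequence(), epochs=1,
                        workers=4, use_multiprocessing=True)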

To be honest, I cannot imagine how this can be done, since the generator and discriminator must be trained alternately per batch. Any ideas or tips on how to implement a fit_generator approach, or to otherwise employ more CPU workers for training?
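The closest I have come is the untested sketch below, which keeps the alternating train_on_batch calls but feeds them through keras.utils.OrderedEnqueuer, which as far as I can tell is what fit_generator uses internally when given a Sequence with workers > 0. The Pix2PixSequence here is hypothetical and just returns random arrays so the sketch is self-contained; in practice it would wrap self.data_loader.load_batch:

    import numpy as np
    from keras.utils import Sequence, OrderedEnqueuer

    # Hypothetical Sequence standing in for self.data_loader.load_batch;
    # random arrays keep the sketch self-contained.
    class Pix2PixSequence(Sequence):
        def __init__(self, n_batches=100, batch_size=1, img_shape=(256, 256, 3)):
            self.n_batches = n_batches
            self.batch_size = batch_size
            self.img_shape = img_shape

        def __len__(self):
            return self.n_batches

        def __getitem__(self, idx):
            imgs_A = np.random.rand(self.batch_size, *self.img_shape).astype("float32")
            imgs_B = np.random.rand(self.batch_size, *self.img_shape).astype("float32")
            return imgs_A, imgs_B

    # Worker processes prefetch batches while the GPU trains.
    enqueuer = OrderedEnqueuer(Pix2PixSequence(), use_multiprocessing=True)
    enqueuer.start(workers=4, max_queue_size=8)
    batch_stream = enqueuer.get()

    for step in range(1000):
        imgs_A, imgs_B = next(batch_stream)
        # Alternating updates exactly as in the snippet above;
        # generator, discriminator, combined, valid and fake are the
        # attributes/label arrays from the existing Pix2Pix class.
        fake_A = generator.predict(imgs_B)
        d_loss_real = discriminator.train_on_batch([imgs_A, imgs_B], valid)
        d_loss_fake = discriminator.train_on_batch([fake_A, imgs_B], fake)
        d_loss = 0.5 * np.add(d_loss_real, d_loss_fake)
        g_loss = combined.train_on_batch([imgs_A, imgs_B], [valid, imgs_A])

    enqueuer.stop()

Does something along these lines make sense, or is there a reason the enqueuer route would not work here?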
