
Parameters of cloned learner is not updating #420

Open
xunil17 opened this issue Oct 28, 2023 · 1 comment
xunil17 commented Oct 28, 2023

Hello, I am running the toy_example.py and printing out the loss and adapt_loss as shown below.

```python
for t in range(TASKS_PER_STEP):
    # Sample a task
    task_params = task_dist.sample()
    mu_i, sigma_i = task_params[:DIM], task_params[DIM:]

    # Adaptation: Instantiate a copy of the model
    learner = maml.clone()
    proposal = learner()

    # Adaptation: Compute and adapt to task loss
    loss = (mu_i - proposal.mean).pow(2).sum() + (sigma_i - proposal.variance).pow(2).sum()
    learner.adapt(loss)
    print(loss)

    # Adaptation: Evaluate the effectiveness of adaptation
    adapt_loss = (mu_i - proposal.mean).pow(2).sum() + (sigma_i - proposal.variance).pow(2).sum()
    print(adapt_loss)

    # Accumulate the error over all tasks
    step_loss += adapt_loss
```

However, I'm receiving the same `adapt_loss` and `loss` at every time step. I'm also inspecting the parameters inside `maml`, `proposal`, and `learner`: `proposal` and `maml` have the same parameters, but `learner.module._parameters` are different. Am I right in understanding that `proposal` and `maml` should have different parameters after `learner.adapt` is called?

Thank you so much!

@xunil17 changed the title from "Parameters of cloned learner is not updating parameters" to "Parameters of cloned learner is not updating" on Oct 28, 2023

ImahnShekhzadeh commented Nov 20, 2023

> However, I'm receiving the same adapt_loss and loss at every timestep.

This is strange; you should indeed receive different loss values. Maybe you can replace `proposal` with `learner()` everywhere and see what happens?
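For what it's worth, identical `loss` and `adapt_loss` is exactly the symptom you'd see if `proposal` is never recomputed after `learner.adapt(loss)`: `adapt` updates the learner's parameters, but `proposal` still holds the output of the pre-adaptation forward pass. Here is a minimal sketch in plain Python (a hypothetical `TinyLearner`, not the learn2learn API) contrasting the stale output with a fresh forward pass:

```python
# Why reusing a stale forward pass gives identical losses: the "proposal"
# is a snapshot of the output, so a loss recomputed from it cannot change
# even though adapt() has already moved the parameters.

class TinyLearner:
    def __init__(self, w):
        self.w = w

    def forward(self):
        # The "proposal": depends on the *current* parameters
        return self.w

    def adapt(self, grad, lr=0.1):
        # Inner-loop update: changes the parameters in place
        self.w -= lr * grad


target = 3.0
learner = TinyLearner(w=0.0)

proposal = learner.forward()           # computed before adaptation
loss = (target - proposal) ** 2        # 9.0
learner.adapt(grad=2 * (proposal - target))

# Stale: reuses the pre-adaptation output, so the loss is unchanged
adapt_loss_stale = (target - proposal) ** 2
# Fresh: re-runs the forward pass with the adapted parameters
adapt_loss_fresh = (target - learner.forward()) ** 2

print(loss, adapt_loss_stale, adapt_loss_fresh)
```

The fresh loss drops after adaptation while the stale one is bit-for-bit equal to `loss`, which matches the behavior reported above.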

> I see that the proposal and maml have the same parameters but the learner.module._parameters are different.

Where exactly are you putting the print statement?

> Am I understanding that the proposal and maml model should have different parameters after learner.adapt is called?

I would expect that too, yes.
