
torch.export code that includes torch.autograd.grad #125984

Open
Linux-cpp-lisp opened this issue May 11, 2024 · 3 comments
Assignees
Labels
oncall: export oncall: pt2 triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Comments

@Linux-cpp-lisp

Linux-cpp-lisp commented May 11, 2024

🐛 Describe the bug

The following minimal example (based on a larger real-world model that fails the same way) fails with errors in torch.export:

import torch

class F(torch.nn.Module):
    def forward(self, x):
        y = x.square()
        return torch.autograd.grad(y.sum(), x)

# Eager mode works as expected:
f = F()
x = torch.ones(3)
x.requires_grad_(True)
print(f(x))

# torch.export fails:
f = F()
x = torch.ones(3)
x.requires_grad_(True)
print(torch.export.export(f, args=(x,), strict=False))

which prints (backtraces omitted):

(tensor([2., 2., 2.]),)
[...]
RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn
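For reference, the same RuntimeError can be reproduced in plain eager mode whenever the differentiated input does not require grad, which is consistent with export's tracing losing the `requires_grad` flag on the example input (my reading of the symptom, not confirmed against export internals):

```python
import torch

# Minimal eager-mode reproduction of the same RuntimeError: if the
# input does not require grad, there is no autograd graph to walk.
x = torch.ones(3)  # note: requires_grad is False here
y = x.square()     # y therefore has no grad_fn
try:
    torch.autograd.grad(y.sum(), x)
except RuntimeError as e:
    print(e)  # element 0 of tensors does not require grad and does not have a grad_fn
```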

With strict=True, it instead errors with:

Unsupported: 'skip function grad in file site-packages/torch/autograd/__init__.py'

Is exporting a module that makes internal calls to the autograd engine a supported use case for torch.export (and related APIs, like torch._export.aot_compile)?
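As a possible workaround (a sketch, not a confirmed supported path): the functional transform torch.func.grad computes the same gradient without the module calling directly into the imperative autograd engine, and functional transforms are generally friendlier to tracing. Whether torch.export accepts this form may depend on the PyTorch version, so the export call below is guarded:

```python
import torch

class G(torch.nn.Module):
    def forward(self, x):
        # torch.func.grad differentiates a pure function of x, so the
        # forward pass never calls torch.autograd.grad itself.
        return torch.func.grad(lambda t: t.square().sum())(x)

g = G()
x = torch.ones(3)
print(g(x))  # tensor([2., 2., 2.])

# Export support for functional transforms may vary by version; guard it.
try:
    ep = torch.export.export(g, args=(x,), strict=False)
    print("exported OK")
except Exception as e:
    print(f"export failed: {e}")
```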

Error logs

No response

Minified repro

No response

Versions

PyTorch 2.3.0

cc @ezyang @msaroufim @bdhirsh @anijain2305 @chauhang @avikchaudhuri @gmagogsfm @zhxchen17 @tugsbayasgalan @angelayi @suo @ydwu4

@bdhirsh bdhirsh added oncall: export triage review triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module and removed triage review labels May 13, 2024
@bdhirsh
Contributor

bdhirsh commented May 14, 2024

From @avikchaudhuri: it seems reasonable to have a better error message for export when people use autograd.grad or .backward() directly in the code they are exporting.

@Linux-cpp-lisp
Author

@bdhirsh does that mean that code containing auto-differentiation is not supported by torch.export? (Is this on the roadmap, or not intended to work?)

@tugsbayasgalan tugsbayasgalan self-assigned this May 21, 2024
@tugsbayasgalan
Contributor

cc: @JackCaoG for the dynamo tracing through autograd.
