
Remove deprecated _aminmax operator #125995

Open

cyyever wants to merge 2 commits into main from aminmax

Conversation

cyyever (Collaborator) commented May 11, 2024

@cyyever cyyever requested a review from jerryzh168 as a code owner May 11, 2024 05:52
pytorch-bot (bot) commented May 11, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/125995

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (2 Unrelated Failures)

As of commit a03ccd5 with merge base fe0a36f:

UNSTABLE - The following jobs failed, but the failures were likely due to flakiness present on trunk and have been marked as unstable:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@cyyever cyyever requested a review from malfet May 11, 2024 05:53
@cyyever cyyever requested a review from mruberry as a code owner May 11, 2024 14:42
@cyyever cyyever marked this pull request as draft May 11, 2024 16:32
@cyyever cyyever force-pushed the aminmax branch 5 times, most recently from 44d155c to e4c98f5 Compare May 12, 2024 05:54
@cyyever cyyever marked this pull request as ready for review May 12, 2024 07:49
@cyyever cyyever requested a review from albanD as a code owner May 12, 2024 07:49
@cyyever cyyever added the module: bc-breaking and topic: bc breaking labels May 12, 2024
ezyang (Contributor) left a comment

Whelp, let's see if it works.

ezyang (Contributor) commented May 12, 2024

@pytorchbot merge

@pytorch-bot pytorch-bot bot added the ciflow/trunk label May 12, 2024
cyyever (Collaborator, Author) commented May 12, 2024

@ezyang Thank you, now I know how to update BC (backward compatibility)....

pytorchmergebot (Collaborator)

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

tinglvv pushed a commit to tinglvv/pytorch that referenced this pull request May 14, 2024
It has been deprecated for a long time.

Co-authored-by: Edward Z. Yang <ezyang@meta.com>
Pull Request resolved: pytorch#125995
Approved by: https://github.com/ezyang
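
For downstream code, the migration this PR forces is mechanical: the private `torch._aminmax` op is replaced by the public `torch.aminmax`, which returns a (min, max) pair and supports the same per-dimension reduction. A minimal sketch (tensor values are illustrative):

```python
import torch

x = torch.tensor([[1.0, -2.0, 3.0],
                  [0.5, 4.0, -1.0]])

# Before (deprecated, removed by this PR):
# mn, mx = torch._aminmax(x)

# After: use the public torch.aminmax, which returns a named
# (min, max) tuple and supports dim/keepdim like the old op.
mn, mx = torch.aminmax(x)          # global min and max
per_row = torch.aminmax(x, dim=1)  # per-row min and max
print(mn.item(), mx.item())        # -2.0 4.0
print(per_row.min, per_row.max)
```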
huydhn (Contributor) commented May 16, 2024

@pytorchbot revert -m 'Sorry for reverting your change but we need to reland this after I get rid of all usage of _aminmax internally in Meta' -c ghfirst

Code-wise, I have been able to replace all usage of _aminmax internally in Meta, but there are some JIT-exported models that need to be re-exported. Once that's done, I will reland the change.

pytorchmergebot (Collaborator)

@pytorchbot successfully started a revert job. Check the current status here.
Questions? Feedback? Please reach out to the PyTorch DevX Team

pytorchmergebot (Collaborator)

Reverting PR 125995 failed

Reason: Command git -C /home/runner/work/pytorch/pytorch revert --no-edit 0116ffae7f94f35a2c712e186a0b371959b68c64 returned non-zero exit code 1

Auto-merging aten/src/ATen/native/TensorCompare.cpp
Auto-merging aten/src/ATen/native/native_functions.yaml
Auto-merging test/allowlist_for_publicAPI.json
Auto-merging test/expect/HasDecompTest.test_has_decomposition.expect
Auto-merging test/forward_backward_compatibility/check_forward_backward_compatibility.py
CONFLICT (content): Merge conflict in test/forward_backward_compatibility/check_forward_backward_compatibility.py
Auto-merging test/test_reductions.py
error: could not revert 0116ffae7f9... Remove deprecated _aminmax operator (#125995)
hint: After resolving the conflicts, mark them with
hint: "git add/rm <pathspec>", then run
hint: "git revert --continue".
hint: You can instead skip this commit with "git revert --skip".
hint: To abort and get back to the state before "git revert",
hint: run "git revert --abort".
Details for Dev Infra team (raised by workflow job).

huydhn added a commit to huydhn/pytorch that referenced this pull request May 16, 2024
pytorchmergebot added a commit that referenced this pull request May 16, 2024
This reverts commit 06d6bb4.

Reverted #126030 on behalf of https://github.com/huydhn due to Sorry for reverting your change but I need to revert it to avoid a diff train conflict with #125995.  Please help rebase and I will reland the change ([comment](#126030 (comment)))
pytorchmergebot referenced this pull request May 16, 2024
Fixes #126012.

`from` is a reserved keyword in Python, so we can't expose the C++ impl with `from` as a function parameter. This PR changes the name to `from_` and adjusts the docs accordingly.

If we want to preserve backwards compatibility, we can leave the C++ name as-is and only fix the docs. However, `torch.can_cast(from_=torch.int, to=torch.int)` won't work then.

Pull Request resolved: #126030
Approved by: https://github.com/albanD
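
For illustration, here is the Python constraint the #126030 description refers to; this sketch assumes the post-#126030 parameter name `from_`:

```python
import torch

# Positional calls have always worked:
torch.can_cast(torch.int32, torch.float64)   # True

# But `from` is a Python keyword, so the following is a SyntaxError,
# not a runtime error -- the parser rejects it outright:
# torch.can_cast(from=torch.int32, to=torch.float64)

# After #126030 the parameter is named `from_`, so keyword
# calls become possible:
torch.can_cast(from_=torch.int32, to=torch.float64)  # True
```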
huydhn (Contributor) commented May 16, 2024

@pytorchbot revert -m 'Sorry for reverting your change but we need to reland this after I get rid of all usage of _aminmax internally in Meta' -c ghfirst

pytorchmergebot (Collaborator)

@pytorchbot successfully started a revert job. Check the current status here.
Questions? Feedback? Please reach out to the PyTorch DevX Team

pytorchmergebot (Collaborator)

@cyyever your PR has been successfully reverted.

pytorchmergebot added a commit that referenced this pull request May 16, 2024
This reverts commit 0116ffa.

Reverted #125995 on behalf of https://github.com/huydhn due to Sorry for reverting your change but we need to reland this after I get rid of all usage of _aminmax internally in Meta ([comment](#125995 (comment)))
ZelboK pushed a commit to ZelboK/pytorch that referenced this pull request May 19, 2024
This reverts commit 06d6bb4.

Reverted pytorch#126030 on behalf of https://github.com/huydhn due to Sorry for reverting your change but I need to revert it to avoid a diff train conflict with pytorch#125995.  Please help rebase and I will reland the change ([comment](pytorch#126030 (comment)))
ZelboK pushed a commit to ZelboK/pytorch that referenced this pull request May 19, 2024
This reverts commit 0116ffa.

Reverted pytorch#125995 on behalf of https://github.com/huydhn due to Sorry for reverting your change but we need to reland this after I get rid of all usage of _aminmax internally in Meta ([comment](pytorch#125995 (comment)))
cyyever and others added 2 commits May 22, 2024 08:17
Signed-off-by: Edward Z. Yang <ezyang@meta.com>
facebook-github-bot (Contributor)
@huydhn has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

huydhn (Contributor) commented May 23, 2024

I have imported the PR and run internal tests. Code-wise, all instances of _aminmax have been updated. However, there are a handful of JIT-exported models that use _aminmax. My understanding is that they would need to be re-exported to work. This won't be a quick task because they are owned by different teams.

cc @kit1980
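
For readers outside Meta, a hedged sketch of why re-export is needed (the module below is hypothetical, standing in for the affected internal models): a TorchScript archive serializes operator calls by name, so an archive that captured `torch._aminmax` fails to load on a build where the op no longer exists; the fix is to re-script and re-save from source against the current API.

```python
import torch

class MinMaxNorm(torch.nn.Module):
    # Hypothetical module standing in for the internal models.
    def forward(self, x):
        # Old archives captured a call to the removed torch._aminmax;
        # re-exported code uses the public op instead.
        mn, mx = torch.aminmax(x)
        return (x - mn) / (mx - mn + 1e-8)

# Re-export: script and save again against the current PyTorch build.
scripted = torch.jit.script(MinMaxNorm())
scripted.save("minmax_norm.pt")

# Archives saved before the migration keep calling the removed op and
# fail when loaded here, e.g. with "Unknown builtin op: aten::_aminmax".
reloaded = torch.jit.load("minmax_norm.pt")
print(reloaded(torch.randn(4)))
```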

cyyever (Collaborator, Author) commented May 24, 2024

@huydhn Thank you; waiting for internal coordination at Meta.
