
[ExecuTorch] Placeholder Assertion Error #2199

Open
YifanShenSZ opened this issue Apr 16, 2024 · 1 comment
Labels
bug Unexpected behaviour that should be corrected (type) · ExecuTorch · triaged Reviewed and examined, release has been assigned if applicable (status)

Comments

@YifanShenSZ (Collaborator)

This toy model fails to export in ExecuTorch

        model = ModuleWrapper(
            function=nn.functional.scaled_dot_product_attention,
            kwargs={
                "attn_mask": None,
                "is_causal": True,
            },
        )

due to

    def assert_functional_graph(fx_g: torch.fx.Graph) -> int:
        placeholders = set()
        copy_count = 0
        # NB: It would also be nice to verify that the mutations all happen at the
        # end, but we also do some administrative views after mutations so this
        # isn't actually true.  (TODO: Could this cause problems for Inductor?)
        for n in fx_g.nodes:
            if n.op == "placeholder":
                placeholders.add(n)
            if isinstance(n.target, torch._ops.OpOverload):
                if n.target is torch.ops.aten.copy_.default:
                    suffix = True
                    # Can only copy_ into an input, and can only do so once
>                   assert n.args[0] in placeholders
E                   AssertionError
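For context, ModuleWrapper is coremltools' internal test helper and its definition is not shown above. A hypothetical minimal stand-in (an assumption, not the actual coremltools class), run eagerly here just to show the call the toy model makes, might look like:

```python
import torch
import torch.nn as nn

class ModuleWrapper(nn.Module):
    """Hypothetical stand-in: wraps a functional op with fixed keyword arguments."""

    def __init__(self, function, kwargs=None):
        super().__init__()
        self.function = function
        self.kwargs = kwargs or {}

    def forward(self, *args):
        return self.function(*args, **self.kwargs)

model = ModuleWrapper(
    function=nn.functional.scaled_dot_product_attention,
    kwargs={"attn_mask": None, "is_causal": True},
)

# Eager execution works; the assertion above fires only during the
# functionalization pass of the ExecuTorch export path.
q = k = v = torch.rand(1, 4, 10, 8)  # (batch, heads, seq_len, head_dim)
out = model(q, k, v)
print(out.shape)  # torch.Size([1, 4, 10, 8])
```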
@YifanShenSZ added the bug (Unexpected behaviour that should be corrected), triaged (Reviewed and examined, release has been assigned if applicable), and ExecuTorch labels on Apr 16, 2024
@teelrabbit (Contributor)

This seems to be an issue with the scaled_dot_product_attention translation in coremltools/converters/mil/frontend/torch/ops.py. I wrote a unit test, and the generated output shape is (<insert_size,), i.e. only one of the two expected dimensions. My unit test is at https://pastes.dev/lOL84mawjJ and its output at https://pastes.dev/dA3Dn6YvV8:

╰─ python assertion-error-test.py                                                                                 (coremltools-env)
2024-04-20 20:19:23.985490: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: SSE4.1 SSE4.2, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/usr/local/Caskroom/miniconda/base/envs/coremltools-env/lib/python3.9/site-packages/tensorflow/python/keras/engine/training_arrays_v1.py:37: UserWarning: A NumPy version >=1.22.4 and <1.29.0 is required for this version of SciPy (detected version 1.22.0)
  from scipy.sparse import issparse  # pylint: disable=g-import-not-at-top
Torch version 2.2.2 has not been tested with coremltools. You may run into unexpected errors. Torch 2.1.0 is the most recent version that has been tested.
Actual output shape: torch.Size([32])
F
======================================================================
FAIL: test_forward (__main__.TestModuleWrapper)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/F.Bueller/Documents/GitHub/sl/coremltools/coremltools/converters/mil/frontend/torch/test/assertion-error-test.py", line 32, in test_forward
    self.assertEqual(output.shape, (10, 32))
AssertionError: torch.Size([32]) != (10, 32)

----------------------------------------------------------------------
Ran 1 test in 0.002s

FAILED (failures=1)
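Since pastes.dev links can expire, here is a sketch of what such a shape test could look like. This is a reconstruction based only on the traceback above (the assertEqual on (10, 32)), not the actual test; it calls scaled_dot_product_attention directly, so unlike the pasted run it exercises PyTorch eagerly rather than the coremltools translation:

```python
import unittest

import torch
import torch.nn.functional as F

class TestSDPAShape(unittest.TestCase):
    def test_forward(self):
        # Unbatched 2-D inputs of shape (seq_len, embed_dim).
        q = torch.rand(10, 32)
        k = torch.rand(10, 32)
        v = torch.rand(10, 32)
        out = F.scaled_dot_product_attention(q, k, v, attn_mask=None, is_causal=True)
        # SDPA returns shape (..., L, E_v), so both dimensions must survive;
        # the bug described above produced torch.Size([32]) instead.
        self.assertEqual(out.shape, (10, 32))

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestSDPAShape)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```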
