
torch.fill_ cannot be applied after an add function #1920

Open
fukatani opened this issue Jul 22, 2023 · 1 comment · May be fixed by #1924
Labels
bug Unexpected behaviour that should be corrected (type) · PyTorch (traced) · triaged Reviewed and examined, release has been assigned if applicable (status)

Comments

@fukatani
Contributor

🐞 Describing the bug

  • torch.fill_ cannot be applied after an add function

This may be related to #1914; we likely need a more general solution.

Stack Trace

Model is not in eval mode. Consider calling '.eval()' on your model prior to conversion
Traceback (most recent call last):
  File "/Users/ryosukefukatani/work/coremltools/onth9.py", line 26, in <module>
    convert_to="neuralnetwork",
  File "/Users/ryosukefukatani/work/coremltools/coremltools/converters/_converters_entry.py", line 542, in convert
    main_pipeline=pass_pipeline,
  File "/Users/ryosukefukatani/work/coremltools/coremltools/converters/mil/converter.py", line 188, in mil_convert
    return _mil_convert(model, convert_from, convert_to, ConverterRegistry, MLModel, compute_units, **kwargs)
  File "/Users/ryosukefukatani/work/coremltools/coremltools/converters/mil/converter.py", line 217, in _mil_convert
    **kwargs
  File "/Users/ryosukefukatani/work/coremltools/coremltools/converters/mil/converter.py", line 286, in mil_convert_to_proto
    prog = frontend_converter(model, **kwargs)
  File "/Users/ryosukefukatani/work/coremltools/coremltools/converters/mil/converter.py", line 108, in __call__
    return load(*args, **kwargs)
  File "/Users/ryosukefukatani/work/coremltools/coremltools/converters/mil/frontend/torch/load.py", line 61, in load
    specification_version,
  File "/Users/ryosukefukatani/work/coremltools/coremltools/converters/mil/frontend/torch/converter.py", line 335, in __init__
    p(self.graph)
  File "/Users/ryosukefukatani/work/coremltools/coremltools/converters/mil/frontend/torch/torchir_passes.py", line 151, in generate_tensor_assignment_ops
    raise ValueError("No matching select or slice.")
ValueError: No matching select or slice.

To Reproduce

import torch
import coremltools as ct
import numpy as np


class Net(torch.nn.Module):
    def forward(self, x):
        y = torch.empty(x.shape).to(torch.int32) + 1
        y.fill_(0.0)
        return y


x = torch.rand(2, 3)
traced_fn = torch.jit.trace(Net(), x)
ct_model = ct.convert(
    traced_fn,
    inputs=[
        ct.TensorType(
            shape=(
                ct.RangeDim(),
                ct.RangeDim(),
            )
        ),
    ],
    source="pytorch",
    convert_to="neuralnetwork",
)

out = traced_fn(x)
out_dict = ct_model.predict(
    {
        'x': x.detach().numpy().astype(np.float32),
    }
)
np.testing.assert_allclose(out, list(out_dict.values())[0], rtol=0.001, atol=0.001)
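A possible workaround until the converter handles this pattern: avoid the in-place `fill_` on a tensor produced by an add, and build the constant-filled result with an out-of-place op instead. This is a minimal sketch (not from the issue thread) assuming `torch.zeros_like` traces to ops the converter already supports; it produces the same output as `y.fill_(0.0)` in the original repro.

```python
import torch


class Net(torch.nn.Module):
    def forward(self, x):
        # Same shape/dtype computation as the failing repro ...
        y = torch.empty(x.shape).to(torch.int32) + 1
        # ... but instead of the in-place y.fill_(0.0), which trips
        # generate_tensor_assignment_ops ("No matching select or slice."),
        # return an out-of-place tensor of zeros with y's shape and dtype.
        return torch.zeros_like(y)


x = torch.rand(2, 3)
traced = torch.jit.trace(Net(), x)
out = traced(x)
```

The traced graph then contains no in-place tensor assignment, so the torchir pass that raises `ValueError: No matching select or slice.` is never exercised for this model.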

System environment (please complete the following information):

  • coremltools version: latest master
@fukatani fukatani added the bug Unexpected behaviour that should be corrected (type) label Jul 22, 2023
@TobyRoseman TobyRoseman added triaged Reviewed and examined, release has been assigned if applicable (status) PyTorch (traced) labels Jul 25, 2023
@TobyRoseman
Collaborator

We probably need a more general solution to #1917.

@fukatani fukatani linked a pull request Jul 27, 2023 that will close this issue