Describe the issue
I have a model exported by PyTorch which was supported until ORT 1.16.3 but now fails with ORT 1.17.0:
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node (MatMulBnFusion_Gemm) Op (Gemm) [ShapeInferenceError] First input does not have rank 2
I have tried several versions of PyTorch and several configurations of torch.onnx.export, but it always fails. As soon as I downgrade to 1.16.3 it works regardless of the PyTorch version, which makes me think that ORT is the culprit.
I have also run onnx.checker.check_model(mymodel, full_check=True) and it raised no issue.
The failure happens even in a pure CPU environment.
The following code passes with ORT 1.16.3 but raises an error with 1.17.0:
To reproduce
import onnx
import onnxruntime as ort
file = 'example_onnx_file.onnx'
mymodel = onnx.load(file)
onnx.checker.check_model(mymodel, full_check=True) # No error
ort.InferenceSession(file, providers=['CPUExecutionProvider'])  # Raises an error under 1.17.0
Urgency
No response
Platform
Linux
OS Version
Debian stable
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.17.0
ONNX Runtime API
Python
Architecture
X64
Execution Provider
Default CPU
Execution Provider Library Version
No response
terminate called after throwing an instance of 'Ort::Exception'
  what(): Node (Loop_5471) Op (Loop) [TypeInferenceError] Graph attribute inferencing failed: Node (Concat_5490) Op (Concat) [ShapeInferenceError] All inputs to Concat must have same rank. Input 1 has rank 2 != 1
(Note that the only change is to use a newer version of onnxruntime; all other settings are kept the same.)