How can I get a QOperator-format ONNX model after quantization?

I'm not aware of QOperator support in AIMET. However, you can export a QDQ-format ONNX model by passing `use_embedded_encodings=True` to AIMET's ONNX export. If you're unfamiliar with the QDQ format, you can find more information in this link.

PS: Please note that AIMET's QDQ export supports only int8 quantization (W8A8), due to a limitation of the ONNX opset version available with the Torch version (1.13) that AIMET uses.