Cannot export jinaai models to onnx format because the model is > 2Gb #1800
Comments
Hi @clarinevong, I cannot reproduce the issue on Linux; this is likely a PyTorch x Windows bug. I would recommend opening a bug report in the PyTorch repo (although the TorchScript issues are a bit deprecated these days, as the effort is moving to Dynamo). Related: microsoft/onnxscript#493. Which PyTorch version are you using? This looks to me to be a bug here with
Thank you for giving it a try on Linux! I still cannot reproduce, using Python 3.10.14 and
Could you share your
Yes, of course.
System Info
Who can help?
@michaelbenayoun @JingyaHuang @echarlaix
I am writing to report an issue I encountered while attempting to export a jinaai model to ONNX format using Optimum.
Error message
RuntimeError: The serialized model is larger than the 2GiB limit imposed by the protobuf library. Therefore the output file must be a file path, so that the ONNX external data can be written to the same directory. Please specify the output file name.
Information

Tasks
- An officially supported task in the examples folder (such as GLUE/SQuAD, ...)

Reproduction (minimal, reproducible, runnable)
optimum-cli export onnx -m jinaai/jina-embeddings-v2-base-en jina-embeddings-v2-base-en-onnx --trust-remote-code
Expected behavior
I would expect Optimum to successfully export the jinaai model to ONNX format without encountering any errors or issues.