
Idefics2 Support in Optimum for ONNX export #1821

Open
gtx-cyber opened this issue Apr 19, 2024 · 5 comments

@gtx-cyber

Feature request

With reference to the new Idefics2 model (https://huggingface.co/HuggingFaceM4/idefics2-8b): I would like to export it to ONNX, which is currently not possible. Please enable conversion support. Current error, with transformers installed from Git via pip:

Traceback (most recent call last):
  File "/usr/local/bin/optimum-cli", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.10/dist-packages/optimum/commands/optimum_cli.py", line 163, in main
    service.run()
  File "/usr/local/lib/python3.10/dist-packages/optimum/commands/export/onnx.py", line 265, in run
    main_export(
  File "/usr/local/lib/python3.10/dist-packages/optimum/exporters/onnx/__main__.py", line 352, in main_export
    onnx_export_from_model(
  File "/usr/local/lib/python3.10/dist-packages/optimum/exporters/onnx/convert.py", line 1048, in onnx_export_from_model
    raise ValueError(
ValueError: Trying to export a idefics2 model, that is a custom or unsupported architecture, but no custom onnx configuration was passed as `custom_onnx_configs`. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type idefics2 to be supported natively in the ONNX export.
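
For context, the `custom_onnx_configs` escape hatch that the error mentions looks roughly like the sketch below. The `Idefics2OnnxConfig` class, its input names, and its dynamic axes are assumptions for illustration (no such config exists in optimum yet), and since idefics2 is multimodal, a working config would almost certainly need more than this:

from typing import Dict

from transformers import AutoConfig

from optimum.exporters.onnx import main_export
from optimum.exporters.onnx.base import OnnxConfig
from optimum.utils import (
    DummyTextInputGenerator,
    DummyVisionInputGenerator,
    NormalizedTextConfig,
)


# Hypothetical config, named here for illustration only.
class Idefics2OnnxConfig(OnnxConfig):
    NORMALIZED_CONFIG_CLASS = NormalizedTextConfig
    DUMMY_INPUT_GENERATOR_CLASSES = (DummyTextInputGenerator, DummyVisionInputGenerator)

    @property
    def inputs(self) -> Dict[str, Dict[int, str]]:
        # Guessed input names and dynamic axes for a vision-text model.
        return {
            "input_ids": {0: "batch_size", 1: "sequence_length"},
            "attention_mask": {0: "batch_size", 1: "sequence_length"},
            "pixel_values": {0: "batch_size", 1: "num_images", 2: "num_channels", 3: "height", 4: "width"},
        }


model_id = "HuggingFaceM4/idefics2-8b"
config = AutoConfig.from_pretrained(model_id)

main_export(
    model_id,
    output="idefics2_onnx",
    task="image-to-text",
    custom_onnx_configs={"model": Idefics2OnnxConfig(config, task="image-to-text")},
)

The guide linked in the error message walks through this same pattern for Whisper; for idefics2, the vision tower, connector, and language model would likely each need their own handling.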

Motivation

The model performs well, and I would like to export it to ONNX as soon as possible.

Your contribution

@gtx-cyber (Author)

@fxmarty

@gtx-cyber (Author)

Please assist with this; it is essential for my use case.

@matbee-eth

It would be nice. I'm assuming we need to ensure that both SigLIP and Mistral 7B are supported in ONNX first.

@gtx-cyber (Author) commented May 7, 2024

@amyeroberts Please have a look. Guidance on how I can proceed with converting a custom model would help; I tried referring to the guide, but it wasn't very clear for a multimodal model like this.

@amyeroberts

Hi @gtx-cyber - thanks for your interest in making this model ONNX exportable! As @matbee-eth mentions, the first step would be to make sure SigLIP and Mistral 7B are exportable.
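
Incidentally, a quick way to check what the exporter already supports natively is to query the task registry; a sketch, assuming TasksManager.get_supported_tasks_for_model_type behaves as in recent optimum releases:

from optimum.exporters import TasksManager

# Query optimum's registry for native ONNX export support per architecture.
for model_type in ("siglip", "mistral", "idefics2"):
    try:
        tasks = TasksManager.get_supported_tasks_for_model_type(model_type, exporter="onnx")
        print(f"{model_type}: exportable for tasks {sorted(tasks)}")
    except (KeyError, ValueError):
        print(f"{model_type}: no native ONNX export support yet")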
