Idefics2 Support in Optimum for ONNX export #1821
Comments
Please assist with this; it is essential.
It would be nice. I'm assuming we first need to ensure both SigLIP and Mistral 8B are supported in ONNX.
@amyeroberts Please have a look. Guidance on how I can proceed with the conversion of a custom model would help; I tried referring to the guide, but it wasn't very clear for a multimodal model like this.
Hi @gtx-cyber - thanks for your interest in making this model ONNX exportable! As @matbee-eth mentions, the first step would be to make sure SigLIP and Mistral 8B are exportable.
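As a first check, the component models can be tried with the standard Optimum export CLI. This is only a sketch: the checkpoint IDs below are illustrative Hub checkpoints for the two backbone families, and whether they export cleanly at the time of writing is exactly what needs verifying.

```shell
# Install Optimum with the ONNX exporter extras (assumption: current PyPI release)
pip install "optimum[exporters]"

# Try exporting a SigLIP vision checkpoint and a Mistral language checkpoint
# separately; output directories are arbitrary
optimum-cli export onnx --model google/siglip-base-patch16-224 siglip_onnx/
optimum-cli export onnx --model mistralai/Mistral-7B-v0.1 mistral_onnx/
```

If both components export without errors, the remaining work is wiring up an export configuration for the combined Idefics2 architecture.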
Feature request
With reference to the new Idefics2 model- https://huggingface.co/HuggingFaceM4/idefics2-8b
I would like to export it to ONNX, which is currently not possible.
Please enable conversion support. The export currently errors out even with transformers installed from Git via pip.
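For reference, the export attempt that currently fails is along these lines (a sketch using Optimum's standard export interface; the output directory name is arbitrary):

```shell
optimum-cli export onnx --model HuggingFaceM4/idefics2-8b idefics2_onnx/
```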
Motivation
The model performs well, and I would like to export it to ONNX as soon as possible.
Your contribution