
Support Llava ONNX export #1813

Open
2 of 4 tasks
Harini-Vemula-2382 opened this issue Apr 12, 2024 · 1 comment
Labels: feature-request (New feature or request), onnx (Related to the ONNX export)

Comments

@Harini-Vemula-2382
System Info

Optimum Version: 1.18.0
Python Version: 3.8
Platform: Windows, x86_64

Who can help?

@michaelbenayoun @JingyaHuang @echarlaix
I am writing to report an issue I encountered while attempting to export the Llava-1.5-7b model to ONNX format using Optimum.

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction (minimal, reproducible, runnable)

optimum-cli export onnx --model liuhaotian/llava-v1.5-7b llava_optimum_onnx/ --trust-remote-code

Expected behavior

I would expect Optimum to successfully export the Llava-1.5-7b model to ONNX format without encountering any errors or issues.

@Harini-Vemula-2382 Harini-Vemula-2382 added the bug Something isn't working label Apr 12, 2024
@fxmarty
Collaborator

fxmarty commented Apr 16, 2024

@Harini-Vemula-2382 Thank you. Llava ONNX export is not yet supported. A PR is open: #1790

The error you get is likely:

  File "/home/felix/transformers/src/transformers/models/auto/auto_factory.py", line 566, in from_pretrained
    raise ValueError(
ValueError: Unrecognized configuration class <class 'transformers.models.llava.configuration_llava.LlavaConfig'> for this kind of AutoModel: AutoModelForCausalLM.
Model type should be one of BartConfig, BertConfig, BertGenerationConfig, BigBirdConfig, BigBirdPegasusConfig, BioGptConfig, BlenderbotConfig, BlenderbotSmallConfig, BloomConfig, CamembertConfig, LlamaConfig, CodeGenConfig, CohereConfig, CpmAntConfig, CTRLConfig, Data2VecTextConfig, ElectraConfig, ErnieConfig, FalconConfig, FuyuConfig, GemmaConfig, GitConfig, GPT2Config, GPT2Config, GPTBigCodeConfig, GPTNeoConfig, GPTNeoXConfig, GPTNeoXJapaneseConfig, GPTJConfig, LlamaConfig, MambaConfig, MarianConfig, MBartConfig, MegaConfig, MegatronBertConfig, MistralConfig, MixtralConfig, MptConfig, MusicgenConfig, MusicgenMelodyConfig, MvpConfig, OpenLlamaConfig, OpenAIGPTConfig, OPTConfig, PegasusConfig, PersimmonConfig, PhiConfig, PLBartConfig, ProphetNetConfig, QDQBertConfig, Qwen2Config, Qwen2MoeConfig, ReformerConfig, RemBertConfig, RobertaConfig, RobertaPreLayerNormConfig, RoCBertConfig, RoFormerConfig, RwkvConfig, Speech2Text2Config, StableLmConfig, Starcoder2Config, TransfoXLConfig, TrOCRConfig, WhisperConfig, XGLMConfig, XLMConfig, XLMProphetNetConfig, XLMRobertaConfig, XLMRobertaXLConfig, XLNetConfig, XmodConfig.

which stems from a wrong task label on the Hugging Face Hub: https://huggingface.co/liuhaotian/llava-v1.6-34b/discussions/11 & https://huggingface.co/datasets/huggingface/transformers-metadata/blob/main/pipeline_tags.json#L440-L441

@fxmarty fxmarty changed the title Issue Report: Unable to Export Llava 1.5 7b Model to ONNX Format in Optimum Support Llava ONNX export Apr 16, 2024
@fxmarty fxmarty added the feature-request (New feature or request) and onnx (Related to the ONNX export) labels and removed the bug (Something isn't working) label Apr 16, 2024