
Unable to Export Chatglm3 Model to ONNX Format in Optimum #1812

Open
2 of 4 tasks
Harini-Vemula-2382 opened this issue Apr 12, 2024 · 2 comments
Labels
bug Something isn't working

Comments

@Harini-Vemula-2382

System Info

Optimum Version: 1.18.0
Python Version: 3.8
Platform: Windows, x86_64

Who can help?

@michaelbenayoun @JingyaHuang @echarlaix
I am writing to report an issue I encountered while attempting to export a Chatglm3 model to ONNX format using Optimum.

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction (minimal, reproducible, runnable)

optimum-cli export onnx --model THUDM/chatglm3-6b chatglm_optimum_onnx/ --trust-remote-code

Expected behavior

I would expect Optimum to successfully export the chatglm3 model to ONNX format without encountering any errors or issues.

@Harini-Vemula-2382 Harini-Vemula-2382 added the bug Something isn't working label Apr 12, 2024
@fxmarty
Collaborator

fxmarty commented Apr 16, 2024

Hi, THUDM/chatglm3-6b is not a Transformers model and the export is expected to fail. Could you share your export log here?

@Harini-Vemula-2382
Author

"optimum-cli export onnx --model THUDM/chatglm3-6b --framework pt --task text-generation-with-past ./chatglm_onnx --trust-remote-code"

I used the command line above to export the model to ONNX format, but I got an error like:

"ValueError: Trying to export a chatglm model, that is a custom or unsupported architecture, but no custom onnx configuration was passed as custom_onnx_configs. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type chatglm to be supported natively in the ONNX export."

I am not sure how to perform the ONNX export. If possible, could you provide the steps to follow, and share any scripts if available?

Here is a snippet of the export log (screenshots: export_log1, export_log2).

Kindly help me with this, and let me know if you have any questions.
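The ValueError above points at the custom_onnx_configs route described in the linked usage guide. A minimal sketch of that approach is below. It is untested: the class ChatGLMOnnxConfig is hypothetical, the attribute names passed to NormalizedTextConfig (num_layers, num_attention_heads) are assumptions about chatglm's configuration object and must be checked against the model's configuration_chatglm.py, and ChatGLM's multi-query attention may additionally require a custom dummy past-key-values generator.

```python
# Sketch of a custom ONNX export for an unsupported decoder-only architecture,
# following the pattern from the Optimum "export a custom model" usage guide.
# Assumptions are marked below; verify them against the chatglm config class.
from transformers import AutoConfig
from optimum.exporters.onnx import main_export
from optimum.exporters.onnx.config import TextDecoderOnnxConfig
from optimum.utils import NormalizedTextConfig


class ChatGLMOnnxConfig(TextDecoderOnnxConfig):
    # Hypothetical config. The attribute names mapped here must match the
    # fields actually present on THUDM/chatglm3-6b's config object.
    DEFAULT_ONNX_OPSET = 14
    NORMALIZED_CONFIG_CLASS = NormalizedTextConfig.with_args(
        num_layers="num_layers",
        num_attention_heads="num_attention_heads",
    )


model_id = "THUDM/chatglm3-6b"
config = AutoConfig.from_pretrained(model_id, trust_remote_code=True)

# use_past=True pairs with the text-generation-with-past task so that
# past key/value inputs and outputs are part of the exported graph.
main_export(
    model_id,
    output="chatglm_onnx/",
    task="text-generation-with-past",
    trust_remote_code=True,
    custom_onnx_configs={
        "model": ChatGLMOnnxConfig(
            config, task="text-generation-with-past", use_past=True
        )
    },
)
```

Note that this only addresses the missing ONNX configuration; as fxmarty points out, chatglm3-6b is not a native Transformers architecture, so the export may still fail inside the custom modeling code itself.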
