
I want to get the “hidden_states” data in the Bert model #20664

Open
dogdogpp opened this issue May 13, 2024 · 3 comments
Labels
feature request: request for unsupported feature or enhancement
model:transformer: issues related to a transformer model: BERT, GPT2, Hugging Face, Longformer, T5, etc.

Comments

@dogdogpp

Describe the feature request

I'm running a TTS project in which the BERT model needs to expose the "hidden_states" output of its intermediate layers for later processing. I'm using C++ and can't seem to find a way to get the intermediate-layer output. Any suggestions?

Describe scenario use case

This would be handy when using the BERT model, because the hidden_states are useful features in their own right. They are easy to obtain in the Python version of Transformers, but difficult to get at from C++.

dogdogpp added the feature request label May 13, 2024
github-actions bot added the model:transformer label May 13, 2024
@tianleiwu
Contributor

You can add "hidden_states" to model output, and export the model to onnx so that onnx model also has the output.

For example, in Hugging Face Transformers, you can pass output_hidden_states=True so that the hidden states are returned.
https://github.com/huggingface/transformers/blob/37bba2a32d2742a10216ffd925bb8f145a732ce1/src/transformers/models/megatron_bert/modeling_megatron_bert.py#L527

@dogdogpp
Author

dogdogpp commented May 13, 2024 via email

@dogdogpp
Author

Yes. However, Transformers is Python-only, and what I need is to get this data from C++, where I can't use Transformers. Is there a way to make that happen? Thanks!

You can add "hidden_states" to the model output and export the model to ONNX so that the ONNX model also has that output.

For example, in Hugging Face Transformers, you can pass `output_hidden_states=True` so that the hidden states are output. https://github.com/huggingface/transformers/blob/37bba2a32d2742a10216ffd925bb8f145a732ce1/src/transformers/models/megatron_bert/modeling_megatron_bert.py#L527
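
For anyone hitting the same problem from C++: the re-export with the extra output is a one-time, offline step that still happens in Python (as suggested above), but after that, reading the hidden states with the ONNX Runtime C++ API is no different from reading any other tensor; you simply add its name to the list of requested outputs in Ort::Session::Run. Below is a minimal sketch; the model file name and the input/output names (bert_with_hidden_states.onnx, input_ids, attention_mask, last_hidden_state, hidden_states) are placeholders and must match whatever names were actually used at export time.

```cpp
// Minimal sketch: fetch an extra "hidden_states" output from a BERT ONNX model
// using the ONNX Runtime C++ API. Assumes the model was re-exported so that
// the hidden states appear as a named graph output (all names are placeholders).
#include <onnxruntime_cxx_api.h>

#include <cstdint>
#include <iostream>
#include <vector>

int main() {
  Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "bert-hidden-states");
  Ort::SessionOptions opts;
  Ort::Session session(env, ORT_TSTR("bert_with_hidden_states.onnx"), opts);

  // Toy batch: one sequence of 8 token ids (replace with real tokenizer output).
  const int64_t seq_len = 8;
  std::vector<int64_t> input_ids(seq_len, 0);
  std::vector<int64_t> attention_mask(seq_len, 1);
  std::vector<int64_t> shape{1, seq_len};

  Ort::MemoryInfo mem = Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);
  std::vector<Ort::Value> inputs;
  inputs.push_back(Ort::Value::CreateTensor<int64_t>(
      mem, input_ids.data(), input_ids.size(), shape.data(), shape.size()));
  inputs.push_back(Ort::Value::CreateTensor<int64_t>(
      mem, attention_mask.data(), attention_mask.size(), shape.data(), shape.size()));

  // Input/output names must match the exported graph; these are assumptions.
  // (A token_type_ids input may also be required, depending on the export.)
  const char* input_names[] = {"input_ids", "attention_mask"};
  const char* output_names[] = {"last_hidden_state", "hidden_states"};

  std::vector<Ort::Value> outputs = session.Run(
      Ort::RunOptions{nullptr},
      input_names, inputs.data(), inputs.size(),
      output_names, 2);

  // Requested outputs come back in the same order as output_names, so
  // outputs[1] holds the hidden states as an ordinary float tensor.
  std::vector<int64_t> dims = outputs[1].GetTensorTypeAndShapeInfo().GetShape();
  const float* data = outputs[1].GetTensorMutableData<float>();
  std::cout << "hidden_states rank " << dims.size()
            << ", first value " << data[0] << std::endl;
  return 0;
}
```

If the exporter emits one output per layer (thirteen tensors for a 12-layer BERT-base, counting the embedding output), each of those is just another entry in output_names; ONNX Runtime only computes the outputs that are actually requested, so unused ones add no cost.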
