
TNN convert & Android inference Error: can't infer resource shape from binary param in benchmark mode, random generator may not be exactly same with the real resource. Segmentation fault. #1949

Gabriel819 opened this issue Aug 2, 2023 · 3 comments

Gabriel819 commented Aug 2, 2023

1. Environment

  • Build OS and Version: Ubuntu 20.04
  • RunTime OS Version: Android
  • RunTime DEVICE: ARM

I'm trying to convert a model I built myself to a TNN file.

  1. I converted my model to ONNX and then to TNN using convert.py in TNN/tools/convert2tnn, and the result is <model_name>.opt.tnnproto rather than <model_name>.tnnproto. What does the opt here mean?
    I got an 'onnx to tnn convert success' message, but is there something wrong?

  2. After putting this TNN model in benchmark/benchmark-model, I ran benchmark/benchmark_android/benchmark_models.sh and got the following error:

```
2023-08-02 20:27:30 968: E source/tnn/optimizer/graph_matcher/ir.cc:230 Found unknown blob [backbone.blocks.0.norm1.weight] at Node [/backbone/blocks.0/norm1/Add_1]
E/tnn: virtual tnn::Status tnn::optimizer::NetOptimizerConvertMatMulToConv::Optimize(tnn::NetStructure *, tnn::NetResource *) [File source/tnn/optimizer/net_optimizer_convert_matmul_to_conv.cc][Line 77] code: 0x1000 msg: source/tnn/optimizer/graph_matcher/ir.cc:230 Found unknown blob [backbone.blocks.0.norm1.weight] at Node [/backbone/blocks.0/norm1/Add_1]
E/tnn: virtual tnn::Status tnn::BinaryLayerResourceGenerator::GenLayerResource(tnn::LayerParam *, tnn::LayerResource **, std::vector<Blob *> &) [File source/tnn/interpreter/layer_resource_generator.cc][Line 399] [WARNNING] can't infer resource shape from binary param in benchmark mode, random generator may not be exactly same with the real resource!
E/tnn: virtual tnn::Status tnn::BinaryLayerResourceGenerator::GenLayerResource(tnn::LayerParam *, tnn::LayerResource **, std::vector<Blob *> &) [File source/tnn/interpreter/layer_resource_generator.cc][Line 399] [WARNNING] can't infer resource shape from binary param in benchmark mode, random generator may not be exactly same with the real resource!
E/tnn: virtual tnn::Status tnn::MatMulLayerResourceGenerator::GenLayerResource(tnn::LayerParam *, tnn::LayerResource **, std::vector<Blob *> &) [File source/tnn/interpreter/layer_resource_generator.cc][Line 544] [WARNNING] can't infer resource shape from MatMul param in benchmark mode, random generator may not be exactly same with the real resource!
E/tnn: virtual tnn::Status tnn::BinaryLayerResourceGenerator::GenLayerResource(tnn::LayerParam *, tnn::LayerResource **, std::vector<Blob *> &) [File source/tnn/interpreter/layer_resource_generator.cc][Line 399] [WARNNING] can't infer resource shape from binary param in benchmark mode, random generator may not be exactly same with the real resource!
Segmentation fault
E/tnn: tnn::Status tnn::OpenCLRuntime::Init() [File source/tnn/device/opencl/opencl_runtime.cc][Line 205] load program cache skipped, ret: 40966, msg: code: 0xA006 msg: open program cache file failed, input path: /data/local/tmp//d1_tnn_ocl_fd8c6f613ff9c0d503dbc462bf21353f_66e6f26f5f12a3349f451b682262ebbb_arm
E/tnn: virtual tnn::Status tnn::BinaryLayerResourceGenerator::GenLayerResource(tnn::LayerParam *, tnn::LayerResource **, std::vector<Blob *> &) [File source/tnn/interpreter/layer_resource_generator.cc][Line 399] [WARNNING] can't infer resource shape from binary param in benchmark mode, random generator may not be exactly same with the real resource!
E/tnn: virtual tnn::Status tnn::BinaryLayerResourceGenerator::GenLayerResource(tnn::LayerParam *, tnn::LayerResource **, std::vector<Blob *> &) [File source/tnn/interpreter/layer_resource_generator.cc][Line 399] [WARNNING] can't infer resource shape from binary param in benchmark mode, random generator may not be exactly same with the real resource!
E/tnn: virtual tnn::Status tnn::MatMulLayerResourceGenerator::GenLayerResource(tnn::LayerParam *, tnn::LayerResource **, std::vector<Blob *> &) [File source/tnn/interpreter/layer_resource_generator.cc][Line 544] [WARNNING] can't infer resource shape from MatMul param in benchmark mode, random generator may not be exactly same with the real resource!
E/tnn: virtual tnn::Status tnn::BinaryLayerResourceGenerator::GenLayerResource(tnn::LayerParam *, tnn::LayerResource **, std::vector<Blob *> &) [File source/tnn/interpreter/layer_resource_generator.cc][Line 399] [WARNNING] can't infer resource shape from binary param in benchmark mode, random generator may not be exactly same with the real resource!
Segmentation fault
```

The log keeps repeating '[WARNNING] can't infer resource shape from binary param in benchmark mode, random generator may not be exactly same with the real resource!'.
The error report gives no clue about which part of the model is wrong. What should be done to solve this problem?
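
For reference, one way to narrow down which part of the model the error points at is to inspect the intermediate ONNX file directly (assuming it is still available). The sketch below is only illustrative: it uses the onnx Python package, the blob and node names are copied from the log above, and "model.onnx" is a placeholder for the actual exported file.

```python
# Check whether the blob named in the TNN error exists in the exported ONNX graph,
# and print the node the error points at so its inputs can be compared.
import onnx

model = onnx.load("model.onnx")  # placeholder path for the exported ONNX file
graph = model.graph

target_blob = "backbone.blocks.0.norm1.weight"  # name taken from the error log
target_node = "/backbone/blocks.0/norm1/Add_1"  # node taken from the error log

initializer_names = {init.name for init in graph.initializer}
node_output_names = {out for node in graph.node for out in node.output}

print("blob is a graph initializer:", target_blob in initializer_names)
print("blob is produced by some node:", target_blob in node_output_names)

for node in graph.node:
    if node.name == target_node:
        print("op type:", node.op_type)
        print("inputs :", list(node.input))
```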


wb014 commented Nov 23, 2023

Same error here.
Have you solved it?

@zhuzhu18

I ran into the same problem. How can it be solved?

@zhuzhu18


I found that when the PyTorch model was converted to ONNX, some Constant operators were folded, so those constant operators were lost when converting the ONNX model to the tnnmodel. With some operators missing, inference is not possible.
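
As a possible follow-up to this diagnosis (not something confirmed in this thread), one workaround sketch is to re-export the PyTorch model to ONNX with constant folding disabled so the Constant operators are kept, and then rerun convert.py in TNN/tools/convert2tnn on the new file. The tiny Sequential model, the input shape, and the file names below are placeholders for the real model.

```python
# Untested workaround sketch: export to ONNX with do_constant_folding=False so
# Constant operators are preserved for the later ONNX -> TNN conversion.
import torch

# Placeholder model standing in for the real network.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 8, kernel_size=3, padding=1),
    torch.nn.ReLU(),
)
model.eval()

dummy_input = torch.randn(1, 3, 224, 224)  # placeholder input shape

torch.onnx.export(
    model,
    dummy_input,
    "model_no_fold.onnx",
    opset_version=11,
    do_constant_folding=False,  # keep Constant ops instead of folding them at export time
    input_names=["input"],
    output_names=["output"],
)
# Afterwards, rerun convert.py in TNN/tools/convert2tnn on model_no_fold.onnx.
```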
