
problem with converted custom model at TensorFlow Lite #67086

Open
rkdtmdals7710 opened this issue May 7, 2024 · 2 comments
Assignees
Labels
comp:lite TF Lite related issues stale This label marks the issue/pr stale - to be closed automatically if no activity stat:awaiting response Status - Awaiting response from author TFLiteConverter For issues related to TFLite converter type:support Support issues

Comments

@rkdtmdals7710

I trained a custom model on yolov5n and converted it to TFLite with `python export.py --weights mymodel.pt --include tflite`. The stock detect.py in /home/pi1/Desktop/project3/examples/lite/examples/object_detection/raspberry_pi works fine with the bundled model, but when I use my converted model I get the following error.

```
(0507) pi1@raspberrypi:~/Desktop/project3/examples/lite/examples/object_detection/raspberry_pi $ python detect.py --model mymodel.tflite
Traceback (most recent call last):
  File "detect.py", line 15, in <module>
    import tensorflow_io as tfio
  File "/home/pi1/miniconda3/envs/0507/lib/python3.7/site-packages/tensorflow_io/__init__.py", line 17, in <module>
    from tensorflow_io.python.api import *  # pylint: disable=wildcard-import
  File "/home/pi1/miniconda3/envs/0507/lib/python3.7/site-packages/tensorflow_io/python/api/__init__.py", line 19, in <module>
    from tensorflow_io.python.ops.io_dataset import IODataset
  File "/home/pi1/miniconda3/envs/0507/lib/python3.7/site-packages/tensorflow_io/python/ops/__init__.py", line 24, in <module>
    import tensorflow as tf
  File "/home/pi1/miniconda3/envs/0507/lib/python3.7/site-packages/tensorflow/__init__.py", line 37, in <module>
    from tensorflow.python.tools import module_util as _module_util
  File "/home/pi1/miniconda3/envs/0507/lib/python3.7/site-packages/tensorflow/python/__init__.py", line 37, in <module>
    from tensorflow.python.eager import context
  File "/home/pi1/miniconda3/envs/0507/lib/python3.7/site-packages/tensorflow/python/eager/context.py", line 28, in <module>
    from tensorflow.core.framework import function_pb2
  File "/home/pi1/miniconda3/envs/0507/lib/python3.7/site-packages/tensorflow/core/framework/function_pb2.py", line 16, in <module>
    from tensorflow.core.framework import attr_value_pb2 as tensorflow_dot_core_dot_framework_dot_attr__value__pb2
  File "/home/pi1/miniconda3/envs/0507/lib/python3.7/site-packages/tensorflow/core/framework/attr_value_pb2.py", line 16, in <module>
    from tensorflow.core.framework import tensor_pb2 as tensorflow_dot_core_dot_framework_dot_tensor__pb2
  File "/home/pi1/miniconda3/envs/0507/lib/python3.7/site-packages/tensorflow/core/framework/tensor_pb2.py", line 16, in <module>
    from tensorflow.core.framework import resource_handle_pb2 as tensorflow_dot_core_dot_framework_dot_resource__handle__pb2
  File "/home/pi1/miniconda3/envs/0507/lib/python3.7/site-packages/tensorflow/core/framework/resource_handle_pb2.py", line 16, in <module>
    from tensorflow.core.framework import tensor_shape_pb2 as tensorflow_dot_core_dot_framework_dot_tensor__shape__pb2
  File "/home/pi1/miniconda3/envs/0507/lib/python3.7/site-packages/tensorflow/core/framework/tensor_shape_pb2.py", line 42, in <module>
    serialized_options=None, file=DESCRIPTOR),
  File "/home/pi1/miniconda3/envs/0507/lib/python3.7/site-packages/google/protobuf/descriptor.py", line 561, in __new__
    _message.Message._CheckCalledFromGeneratedFile()
TypeError: Descriptors cannot not be created directly.
If this call came from a _pb2.py file, your generated code is out of date and must be regenerated with protoc >= 3.19.0.
If you cannot immediately regenerate your protos, some other possible workarounds are:
 1. Downgrade the protobuf package to 3.20.x or lower.
 2. Set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python (but this will use pure-Python parsing and will be much slower).

More information: https://developers.google.com/protocol-buffers/docs/news/2022-05-06#python-updates
```
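Of the two workarounds listed in the error message, the second can also be applied from inside a script rather than the shell, as long as the environment variable is set before TensorFlow (or anything that imports it) is loaded. A minimal sketch:

```python
import os

# Must run before tensorflow / tensorflow_io is imported anywhere in the process;
# otherwise the C++ protobuf implementation is already selected.
os.environ["PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION"] = "python"

# import tensorflow as tf  # would now fall back to the pure-Python protobuf parser (slower)
```

The first workaround corresponds to something like `pip install "protobuf==3.20.*"` on the command line.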

After downgrading protobuf, I get a different error instead. Please help me resolve this.

```
(0507) pi1@raspberrypi:~/Desktop/project3/examples/lite/examples/object_detection/raspberry_pi $ python detect.py --model mymodel.tflite
/home/pi1/miniconda3/envs/0507/lib/python3.7/site-packages/tensorflow_io/python/ops/__init__.py:98: UserWarning: unable to load libtensorflow_io_plugins.so: unable to open file: libtensorflow_io_plugins.so, from paths: ['/home/pi1/miniconda3/envs/0507/lib/python3.7/site-packages/tensorflow_io/python/ops/libtensorflow_io_plugins.so']
caused by: ['/home/pi1/miniconda3/envs/0507/lib/python3.7/site-packages/tensorflow_io/python/ops/libtensorflow_io_plugins.so: undefined symbol: _ZN10tensorflow8internal15LogMessageFatalC1EPKci']
  warnings.warn(f"unable to load libtensorflow_io_plugins.so: {e}")
/home/pi1/miniconda3/envs/0507/lib/python3.7/site-packages/tensorflow_io/python/ops/__init__.py:104: UserWarning: file system plugins are not loaded: unable to open file: libtensorflow_io.so, from paths: ['/home/pi1/miniconda3/envs/0507/lib/python3.7/site-packages/tensorflow_io/python/ops/libtensorflow_io.so']
caused by: ['/home/pi1/miniconda3/envs/0507/lib/python3.7/site-packages/tensorflow_io/python/ops/libtensorflow_io.so: undefined symbol: _ZN10tensorflow4data11DatasetBase8FinalizeEPNS_15OpKernelContextESt8functionIFNS_8StatusOrISt10unique_ptrIS1_NS_4core15RefCountDeleterEEEEvEE']
  warnings.warn(f"file system plugins are not loaded: {e}")
INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
Traceback (most recent call last):
  File "detect.py", line 152, in <module>
    main()
  File "detect.py", line 148, in main
    int(args.numThreads), bool(args.enableEdgeTPU))
  File "detect.py", line 65, in run
    detector = vision.ObjectDetector.create_from_options(options)
  File "/home/pi1/miniconda3/envs/0507/lib/python3.7/site-packages/tensorflow_lite_support/python/task/vision/object_detector.py", line 91, in create_from_options
    options.base_options.to_pb2(), options.detection_options.to_pb2())
RuntimeError: Input tensor has type kTfLiteFloat32: it requires specifying NormalizationOptions metadata to preprocess input images.
```
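The final RuntimeError means the converted float32 model carries no NormalizationOptions in its TFLite metadata, so the Task Library does not know how to scale input pixels before inference; the usual remedy is to attach metadata (e.g. with the `tflite_support` metadata writer tooling) whose mean/std match what the model expects. As an illustration of the preprocessing NormalizationOptions declares (the mean/std values below are assumptions for a yolov5-style export expecting float inputs in [0, 1], not values taken from this issue):

```python
def task_library_normalize(pixel, mean, std):
    """Per-channel preprocessing declared by NormalizationOptions metadata:
    normalized = (pixel - mean) / std."""
    return (pixel - mean) / std

# Assumed values for a model expecting [0, 1] float input: mean=0, std=255.
print(task_library_normalize(0, 0.0, 255.0))    # 0.0
print(task_library_normalize(255, 0.0, 255.0))  # 1.0
```

With mean=0 and std=255 an 8-bit pixel maps to [0, 1]; a model expecting [-1, 1] would instead use mean=127.5, std=127.5.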

@rkdtmdals7710 rkdtmdals7710 added the TFLiteConverter For issues related to TFLite converter label May 7, 2024
@sushreebarsa sushreebarsa added type:support Support issues comp:lite TF Lite related issues labels May 8, 2024
@sushreebarsa
Contributor

@rkdtmdals7710 For such issues the recommended solution is to upgrade protobuf to version 3.19.0 or later. You can usually do this with your package manager:

```shell
pip install --upgrade protobuf
```

Also, please verify the tensorflow-io installation:

```shell
pip install tensorflow-io
```

```python
import tensorflow_io as tfio
print(tfio.__version__)
```


Thank you!

@sushreebarsa sushreebarsa added the stat:awaiting response Status - Awaiting response from author label May 15, 2024

This issue is stale because it has been open for 7 days with no activity. It will be closed if no further activity occurs. Thank you.

@github-actions github-actions bot added the stale This label marks the issue/pr stale - to be closed automatically if no activity label May 23, 2024