mtomo

Multiple types of NN model optimization environments. The host PC's GUI and camera can be accessed directly from the container to verify that converted models run correctly. Intel iHD GPUs (iGPU) and NVIDIA GPUs (dGPU) are supported.

1. Environment

  1. Docker 20.10.5, build 55c4c88
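
  The host's Docker version can be confirmed with:

  $ docker --version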

2. Model optimization environment to be built

  1. Ubuntu 20.04 x86_64
  2. CUDA 11.2
  3. cuDNN 8.1
  4. TensorFlow v2.5.0-rc1 (MediaPipe Custom OP, FlexDelegate, XNNPACK enabled)
  5. tflite_runtime v2.5.0-rc1 (MediaPipe Custom OP, FlexDelegate, XNNPACK enabled)
  6. edgetpu-compiler
  7. flatc 1.12.0
  8. TensorRT cuda11.1-trt7.2.3.4-ga-20210226
  9. PyTorch 1.8.1+cu112
  10. TorchVision 0.9.1+cu112
  11. TorchAudio 0.8.1
  12. OpenVINO 2021.3.394
  13. tensorflowjs
  14. coremltools
  15. onnx
  16. tf2onnx
  17. tensorflow-datasets
  18. openvino2tensorflow
  19. tflite2tensorflow
  20. onnxruntime
  21. onnx-simplifier
  22. MXNet
  23. gdown
  24. OpenCV 4.5.2-openvino
  25. Intel-Media-SDK
  26. Intel iHD GPU (iGPU) support
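
The tools above are intended to be chained together. Below is a minimal sketch of one possible conversion flow (TensorFlow SavedModel -> ONNX -> OpenVINO IR), assuming a SavedModel exists at ./saved_model inside the container and that OpenVINO is installed at the default /opt/intel location; all file and directory names are placeholders:

# TensorFlow SavedModel -> ONNX
$ python3 -m tf2onnx.convert --saved-model saved_model --opset 12 --output model.onnx
# Simplify the ONNX graph
$ python3 -m onnxsim model.onnx model_opt.onnx
# ONNX -> OpenVINO IR (FP32)
$ python3 /opt/intel/openvino_2021/deployment_tools/model_optimizer/mo.py \
    --input_model model_opt.onnx \
    --data_type FP32 \
    --output_dir openvino/FP32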

3. Usage

3-1. Docker Hub

https://hub.docker.com/repository/docker/pinto0309/mtomo/tags?page=1&ordering=last_updated
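
To use a prebuilt image, pull one of the published tags; for example, the tag referenced in the run command below:

$ docker pull pinto0309/mtomo:ubuntu2004_tf2.5.0-rc1_torch1.8.1_openvino2021.3.394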

$ xhost +local: && \
  docker run -it --rm \
    --gpus all \
    -v `pwd`:/home/user/workdir \
    -v /tmp/.X11-unix/:/tmp/.X11-unix:rw \
    --device /dev/video0:/dev/video0:mwr \
    --net=host \
    -e LIBVA_DRIVER_NAME=iHD \
    -e XDG_RUNTIME_DIR=$XDG_RUNTIME_DIR \
    -e DISPLAY=$DISPLAY \
    --privileged \
    pinto0309/mtomo:ubuntu2004_tf2.5.0-rc1_torch1.8.1_openvino2021.3.394
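
Once inside the container, the GPU and camera pass-through can be spot-checked. A minimal sketch, assuming the NVIDIA Container Toolkit is installed on the host and a camera is attached as /dev/video0:

$ nvidia-smi                 # dGPU is visible inside the container
$ ls -l /dev/video0          # camera device has been passed through
$ python3 -c "import tensorflow as tf; print(tf.__version__)"   # installed TensorFlow version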

3-2. Docker Build

$ git clone https://github.com/PINTO0309/mtomo.git && cd mtomo
$ docker build -t {IMAGE_NAME}:{TAG} .
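
For example, with a hypothetical local image name and tag:

$ docker build -t mtomo:local .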

3-3. Docker Run

$ xhost +local: && \
  docker run -it --rm \
    --gpus all \
    -v `pwd`:/home/user/workdir \
    -v /tmp/.X11-unix/:/tmp/.X11-unix:rw \
    --device /dev/video0:/dev/video0:mwr \
    --net=host \
    -e LIBVA_DRIVER_NAME=iHD \
    -e XDG_RUNTIME_DIR=$XDG_RUNTIME_DIR \
    -e DISPLAY=$DISPLAY \
    --privileged \
    {IMAGE_NAME}:{TAG}
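
Note that xhost +local: grants local clients access to the X server so the container can display GUI windows; after leaving the container, the permission can be revoked with:

$ xhost -local: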

4. Reference articles

  1. openvino2tensorflow
  2. tflite2tensorflow
  3. tensorflow-onnx (a.k.a. tf2onnx)
  4. tensorflowjs
  5. coremltools
  6. OpenVINO
  7. onnx
  8. onnx-simplifier
  9. TensorFlow
  10. PyTorch
  11. flatbuffers (a.k.a. flatc)
  12. TensorRT
  13. Intel-Media-SDK/MediaSDK - Running on GPU under docker
  14. Intel-Media-SDK/MediaSDK - Intel media stack on Ubuntu
