Docker Hub | API documentation | Web tool
ONNX model and Docker service for https://github.com/7eu7d7/ML-Danbooru.
Exporting ML-Danbooru's models from PyTorch to ONNX significantly improves inference times when running on CPU.
```
docker run --publish $PORT:$PORT --env PORT=$PORT --detach nanoskript/ml-danbooru
```
All configuration options are optional.
- `MODEL` - The checkpoint model to be converted into an ONNX model. Currently, only `ml_caformer_m36_fp16_dec-5-97527.ckpt` is supported.
- `IMAGE_SIZE` - The resolution to scale input images to before processing them. Must be a multiple of `32`. The default is `256`.
Build-time arguments can be provided with the `--build-arg` flag:

```
docker build --build-arg IMAGE_SIZE=256 ...
```
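The `IMAGE_SIZE` constraint above can be sketched as a small validation function. This is only an illustration of the documented rule (a positive multiple of `32`, defaulting to `256`); the function name is hypothetical and not part of the service:

```python
def validate_image_size(size: int) -> int:
    """Check an IMAGE_SIZE value against the documented constraint:
    it must be a positive multiple of 32."""
    if size <= 0 or size % 32 != 0:
        raise ValueError(f"IMAGE_SIZE must be a positive multiple of 32, got {size}")
    return size

print(validate_image_size(256))  # the documented default -> 256
```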
- `DEFAULT_THRESHOLD` - The default confidence threshold by which results are filtered. The default is `0.5`.
- `MINIMUM_THRESHOLD` - The minimum confidence threshold allowed for filtering. Requests with a threshold lower than this will be rejected.
Runtime arguments can be provided with the `--env` flag:

```
docker run --env DEFAULT_THRESHOLD=0.5 ...
```
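The threshold semantics described above can be sketched as follows. This is a minimal illustration of the documented behavior, not the service's actual implementation; the function name, the example tags, and the `MINIMUM_THRESHOLD` value shown are hypothetical:

```python
DEFAULT_THRESHOLD = 0.5   # documented default
MINIMUM_THRESHOLD = 0.2   # illustrative value; configured via --env in practice

def filter_tags(scores, threshold=None):
    """Filter tag confidences, mirroring the documented threshold rules:
    an unspecified threshold falls back to DEFAULT_THRESHOLD, and a
    threshold below MINIMUM_THRESHOLD causes the request to be rejected."""
    if threshold is None:
        threshold = DEFAULT_THRESHOLD
    if threshold < MINIMUM_THRESHOLD:
        raise ValueError("threshold is below MINIMUM_THRESHOLD; request rejected")
    return {tag: score for tag, score in scores.items() if score >= threshold}

scores = {"1girl": 0.97, "outdoors": 0.62, "rain": 0.31}
print(filter_tags(scores))  # -> {'1girl': 0.97, 'outdoors': 0.62}
```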
- Python 3.10 with the `pdm` package manager
- Clone this repository: `git clone --recurse-submodules https://github.com/nanoskript/ml-danbooru-onnx-docker.git`
- Install dependencies: `pdm sync -G generate-onnx`
- Run `generate-onnx.py`: `pdm run generate-onnx.py`

A `model.onnx` file will be created in the `vendor` folder.