Post-training static quantization using ResNet18 architecture (Jupyter Notebook, updated Aug 1, 2020)
micronet, a model compression and deployment library. Compression: (1) quantization: quantization-aware training (QAT), high-bit (>2b) (DoReFa; "Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference") and low-bit (≤2b) ternary/binary (TWN/BNN/XNOR-Net); post-training quantization (PTQ), 8-bit (TensorRT); (2) pruning: normal, reg…
Quantization for Object Detection in Tensorflow 2.x
Accompanies a research paper published in MDPI Sensors and provides details about the project.
PyTorch implementation of our ECCV 2022 paper, "Fine-grained Data Distribution Alignment for Post-Training Quantization".
This sample shows how to convert a TensorFlow model to an OpenVINO IR model and how to quantize the OpenVINO model.
This repository contains notebooks that show the usage of TensorFlow Lite for quantizing deep neural networks.
Model quantization with PyTorch, TensorFlow & Larq
[IJCAI 2022] FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer
Low-bit (2/4/8/16) Post Training Quantization for ResNet20
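Low-bit PTQ at any of these bit-widths reduces to picking a scale and zero-point per tensor. A minimal NumPy sketch of asymmetric uniform (affine) quantization; the function names, per-tensor min/max calibration, and round-half-to-even behavior are illustrative assumptions, not the scheme of any particular listed repository:

```python
import numpy as np

def quantize(x, num_bits=8):
    """Asymmetric affine quantization: q = clip(round(x / scale) + zero_point).

    Assumes x is non-constant so the scale is nonzero.
    """
    qmin, qmax = 0, 2 ** num_bits - 1
    scale = (x.max() - x.min()) / (qmax - qmin)          # real step size per level
    zero_point = int(round(qmin - x.min() / scale))      # integer offset mapping x.min -> qmin
    q = np.clip(np.round(x / scale) + zero_point, qmin, qmax)
    return q.astype(np.int64), scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover an approximation of the original real values."""
    return scale * (q.astype(np.float64) - zero_point)

x = np.array([-1.0, 0.0, 0.5, 1.0])
q, scale, zp = quantize(x, num_bits=4)   # 16 levels
x_hat = dequantize(q, scale, zp)
# Reconstruction error per element stays within about half a quantization step.
```

Shrinking `num_bits` from 8 down to 4 or 2 widens the step size and hence the rounding error, which is why the lower bit-widths usually need the more careful calibration the low-bit PTQ projects above focus on.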
Generating a TensorRT model using ONNX
A framework to train a ResUNet architecture, quantize, compile and execute it on an FPGA.
Quantization examples for PTQ & QAT
Post-training quantization on Nvidia Nemo ASR model
Improved the performance of 8-bit PTQ4DM, especially on FID.
Implementation of EPTQ - an Enhanced Post-Training Quantization algorithm for DNN compression
Comprehensive study on the quantization of various CNN models, employing techniques such as Post-Training Quantization and Quantization Aware Training (QAT).
Notes on quantization in neural networks
A model compression and acceleration toolbox based on PyTorch.