This repository has been archived by the owner on May 1, 2023. It is now read-only.

Loading quantized models with the right state dictionary #546

Open
mueedurrehman opened this issue Dec 8, 2020 · 0 comments
I am using distiller's ResNet-56 model for CIFAR. After quantization-aware training with QuantAwareTrainRangeLinearQuantizer, what is the procedure for instantiating an instance of that model and loading the correct layers from the state dictionary? The saved state dictionary contains the floating-point values for each layer, while the stock distiller model lacks the additional layers that the quantized model has. Is the correct procedure to instantiate the same QuantAwareTrainRangeLinearQuantizer but with train_with_fp_copy=False, call prepare_model on my model, and then restore all layers from the checkpoint's state dictionary, excluding the floating-point copies of the weights?
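For reference, a minimal sketch of the filtering step described above, independent of distiller itself. It assumes the full-precision shadow copies in the checkpoint follow distiller's `float_weight` naming convention (an assumption; adjust the suffix to match your checkpoint's actual keys):

```python
# Hypothetical sketch: strip the full-precision weight copies from a
# quant-aware-training checkpoint's state dict so the remaining entries
# can be loaded into a model prepared with train_with_fp_copy=False.
# The 'float_weight' suffix is an assumption about the checkpoint's
# key naming, not a guaranteed distiller API.

def strip_fp_copies(state_dict, fp_suffix="float_weight"):
    """Return a new state dict without the full-precision weight copies."""
    return {name: value for name, value in state_dict.items()
            if not name.endswith(fp_suffix)}

# Toy example with placeholder values standing in for tensors:
ckpt = {
    "conv1.weight": "quantized weight",
    "conv1.float_weight": "fp32 shadow copy",
    "conv1.fake_q.scale": 0.05,
}
filtered = strip_fp_copies(ckpt)
# filtered keeps 'conv1.weight' and 'conv1.fake_q.scale' only
```

When loading the filtered dict with PyTorch's `load_state_dict`, passing `strict=False` can also help surface any remaining key mismatches between the checkpoint and the prepared model.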

@mueedurrehman mueedurrehman changed the title Loading quantized model's with the right state dictionary Loading quantized models with the right state dictionary Dec 8, 2020