Rethinking Binarized Neural Network Optimization

Description

This repository aims to reproduce the results of "Latent Weights Do Not Exist: Rethinking Binarized Neural Network Optimization" as part of the NeurIPS 2019 reproducibility challenge. We have implemented the paper's binary optimizer (Bop) in PyTorch and use it to train a binarized neural network on CIFAR-10. See the reproducibility report for details.
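For reference, the core of Bop as described in the paper is a flip rule: each binary weight keeps an exponential moving average of its gradient, and the weight is flipped when that average exceeds a threshold and agrees in sign with the weight. Below is a minimal, illustrative PyTorch sketch of that rule; the class name `BopSketch` and the exact hyper-parameter handling are assumptions and do not reflect the actual implementation in `bytorch`.

```python
import torch
from torch.optim.optimizer import Optimizer


class BopSketch(Optimizer):
    """Illustrative sketch of the Bop flip rule, not the repository's API."""

    def __init__(self, params, gamma=1e-4, tau=1e-8):
        # gamma: adaptivity rate, tau: flip threshold (names follow the paper)
        super().__init__(params, dict(gamma=gamma, tau=tau))

    @torch.no_grad()
    def step(self, closure=None):
        for group in self.param_groups:
            gamma, tau = group["gamma"], group["tau"]
            for w in group["params"]:
                if w.grad is None:
                    continue
                state = self.state[w]
                if "m" not in state:
                    state["m"] = torch.zeros_like(w)
                m = state["m"]
                # exponential moving average of the gradient
                m.mul_(1 - gamma).add_(w.grad, alpha=gamma)
                # flip a binary weight when the averaged gradient is strong
                # enough (|m| >= tau) and agrees in sign with the weight
                flip = (m.abs() >= tau) & (torch.sign(m) == torch.sign(w))
                w[flip] = -w[flip]
```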

How to run

First, install the dependencies:

```bash
# clone project
git clone https://github.com/nikvaessen/Rethinking-Binarized-Neural-Network-Optimization

# install project
cd Rethinking-Binarized-Neural-Network-Optimization
pip install -r requirements.txt
pip install -e .
```

If you are interested in training a BNN on CIFAR-10, navigate to research_seed/cifar and run cifar_trainer.py:

```bash
# module folder
cd research_seed/cifar/

# run module
python cifar_trainer.py
```

Main Contribution

In order to reproduce the original paper we have implemented the following:

  • bytorch implements the binary optimizer and binary layers in PyTorch (a minimal illustrative sketch follows this list)
  • cifar implements BinaryNet (from this paper) for CIFAR-10
  • theoretical implements experiments to disprove the approximation viewpoint and to study the behaviour of learning rates under latent-weight optimization
  • experiments contains convenience scripts to reproduce the experiments of section 5.1 and 5.2 of the original paper
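To illustrate what a binary layer involves, here is a minimal sketch of a sign activation with a straight-through estimator, the standard trick BinaryNet-style networks use to backpropagate through binarized activations. The names `SignSTE` and `BinaryActivation` are illustrative and not the repository's actual API.

```python
import torch
import torch.nn as nn


class SignSTE(torch.autograd.Function):
    """Sign function with a straight-through estimator (illustrative)."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        # note: torch.sign maps 0 to 0; real implementations usually map 0 to +1 or -1
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # straight-through estimator: pass the gradient only where |x| <= 1
        return grad_output * (x.abs() <= 1).to(grad_output.dtype)


class BinaryActivation(nn.Module):
    def forward(self, x):
        return SignSTE.apply(x)
```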
