
BERT-for-Mobile

Compares the DistilBERT and MobileBERT architectures for mobile deployments. This repository is part of the End-To-End TFLite Tutorials project. Here's a blog post (TODO: link) that presents the comparison comprehensively.

The notebooks included in this repository show how to use the DistilBERT model with the SST-2 dataset for text classification. The notebooks also include the code for TensorFlow Lite model conversion and evaluation. In the future, I also plan to cover question-answering with DistilBERT.
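The conversion and evaluation flow the notebooks cover can be sketched as follows. This is a minimal illustration using a tiny stand-in Keras classifier rather than the actual fine-tuned DistilBERT (which requires the `transformers` library and the SST-2 fine-tuned weights); the TensorFlow Lite conversion and interpreter steps are the same.

```python
import numpy as np
import tensorflow as tf

# Stand-in classifier: in the notebooks this is the fine-tuned DistilBERT
# model. The conversion steps below are identical either way.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128,), dtype=tf.float32),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(2),  # SST-2 is a two-class (pos/neg) task
])

# Convert the Keras model to a TFLite flatbuffer.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # dynamic-range quantization
tflite_model = converter.convert()

with open("classifier.tflite", "wb") as f:
    f.write(tflite_model)

# Evaluate the converted model with the TFLite interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

x = np.random.rand(1, 128).astype(np.float32)
interpreter.set_tensor(inp["index"], x)
interpreter.invoke()
logits = interpreter.get_tensor(out["index"])
print(logits.shape)  # one logit per class
```

For the real DistilBERT model, the input would instead be tokenized `input_ids` and `attention_mask` tensors; see the notebooks for the full pipeline.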

Note that this repository does not contain the model training and conversion code for MobileBERT, since TensorFlow Lite already provides a straightforward guide for that. A similar guide for question-answering is also available here.

TensorFlow Lite model files and fine-tuned weights

Benchmarks

Acknowledgements

Thanks to the ML-GDE program for providing the GCP credits used to spin up an AI Platform Notebook. Thanks to Khanh LeViet for his guidance.