
# batch-normalization

Here are 187 public repositories matching this topic...

This repository explores the importance of individual MLP components using the MNIST dataset. It experiments with techniques such as Dropout, Batch Normalization, and different optimization algorithms to improve MLP performance, offering a deeper understanding of these components and how to fine-tune them for optimal classification accuracy on MNIST.

  • Updated Jun 12, 2023
  • Jupyter Notebook
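To illustrate the Batch Normalization technique this topic covers, here is a minimal NumPy sketch of the forward pass (an illustrative example, not code from any listed repository): each feature is normalized over the batch dimension, then rescaled by learnable parameters gamma and beta.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    # Normalize each feature across the batch (axis 0),
    # then apply the learnable scale (gamma) and shift (beta).
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=(64, 10))  # batch of 64, 10 features
out = batch_norm_forward(x, gamma=np.ones(10), beta=np.zeros(10))

# After normalization, each feature has (approximately) zero mean
# and unit standard deviation across the batch.
print(np.allclose(out.mean(axis=0), 0.0, atol=1e-6))
print(np.allclose(out.std(axis=0), 1.0, atol=1e-2))
```

In a full layer (as in frameworks like PyTorch or Keras), gamma and beta would be trained by gradient descent, and running averages of mean and variance would be kept for use at inference time.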
