# optimizers

Here are 68 public repositories matching this topic...

This repository contains the code for CS-671: Introduction to Deep Learning, a course offered by IIT Mandi during the even semester of 2022. It includes implementations of the various deep learning algorithms and techniques covered in the course.

  • Updated Jul 16, 2023
  • Jupyter Notebook

🧑‍🏫 50! Implementations/tutorials of deep learning papers with side-by-side notes 📝, including transformers (original, XL, Switch, Feedback, ViT, ...), optimizers (Adam, AdaBelief, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, ... 🧠

  • Updated Mar 16, 2024
  • Jupyter Notebook

A neural network implemented with different activation functions (sigmoid, ReLU, leaky ReLU, softmax) and different optimizers (gradient descent, AdaGrad, RMSProp, Adam). You can also choose among different loss functions: cross-entropy loss, hinge loss, and mean squared error (MSE). (A minimal sketch of the four optimizer update rules follows below.)

  • Updated Aug 15, 2022
  • Jupyter Notebook
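
For readers comparing the four optimizers named in the description above, here is a minimal NumPy sketch of their update rules. The function names, hyperparameter defaults, and the toy quadratic objective are illustrative assumptions, not code taken from the repository itself.

```python
import numpy as np

# Hypothetical helper functions illustrating the four update rules.
# `w` is the parameter vector, `g` its gradient at the current step.

def sgd_step(w, g, lr=0.01):
    """Plain gradient descent: step against the gradient."""
    return w - lr * g

def adagrad_step(w, g, cache, lr=0.01, eps=1e-8):
    """AdaGrad: scale each coordinate by its accumulated squared gradients."""
    cache = cache + g ** 2
    return w - lr * g / (np.sqrt(cache) + eps), cache

def rmsprop_step(w, g, cache, lr=0.001, decay=0.9, eps=1e-8):
    """RMSProp: like AdaGrad, but with an exponential moving average."""
    cache = decay * cache + (1 - decay) * g ** 2
    return w - lr * g / (np.sqrt(cache) + eps), cache

def adam_step(w, g, m, v, t, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam: bias-corrected moving averages of the gradient and its square."""
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g ** 2
    m_hat = m / (1 - beta1 ** t)   # bias correction; t starts at 1
    v_hat = v / (1 - beta2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Toy demo: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([5.0, -3.0])
m, v = np.zeros_like(w), np.zeros_like(w)
for t in range(1, 501):
    g = 2 * w
    w, m, v = adam_step(w, g, m, v, t)
print(w)  # approaches the minimum at the origin
```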
