Recipe

Batch normalization

Total time needed: ~3 hours
Learning Objectives
Learn the concept of batch normalization and the math behind it
Potential Use Cases
Batch normalization is a technique for training very deep neural networks that standardizes the inputs to a layer for each mini-batch.
Target Audience
INTERMEDIATE
Go through the following annotated items in order:
VIDEO 1. Normalizing Inputs
  • How does normalization work?
  • Why do we need to normalize the inputs to a neural network?
10 minutes
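As a quick sketch of the idea in the video above: input normalization rescales each feature to zero mean and unit variance, so features on very different scales (the toy values below are purely illustrative) contribute comparably during training.

```python
import numpy as np

# Toy data: two features on very different scales (illustrative values)
X = np.array([[1.0, 200.0],
              [2.0, 400.0],
              [3.0, 600.0]])

mu = X.mean(axis=0)         # per-feature mean
sigma = X.std(axis=0)       # per-feature standard deviation
X_norm = (X - mu) / sigma   # standardized inputs

print(X_norm.mean(axis=0))  # ~[0, 0]
print(X_norm.std(axis=0))   # ~[1, 1]
```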
ARTICLE 2. Intro on mini batch gradient descent (with pseudo code)
  • What is Mini-Batch Gradient Descent?
  • How to Configure Mini-Batch Gradient Descent?
20 minutes
LIBRARY 3. Mini-batch GD from scratch in Python
  • How to implement mini-batch GD in Python?
10 minutes
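The two items above can be sketched in a few lines: shuffle the data each epoch, slice it into mini-batches, and take a gradient step per batch. This minimal example fits a toy linear model y = 3x + 2 (the data, learning rate, and batch size are illustrative assumptions, not from the linked resources).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 3x + 2 plus a little noise (illustrative)
X = rng.uniform(-1, 1, size=(200, 1))
y = 3 * X[:, 0] + 2 + rng.normal(0, 0.1, size=200)

w, b = 0.0, 0.0
lr, batch_size, epochs = 0.1, 32, 100

for _ in range(epochs):
    idx = rng.permutation(len(X))              # shuffle each epoch
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]  # one mini-batch of indices
        xb, yb = X[batch, 0], y[batch]
        err = (w * xb + b) - yb
        # Gradient of mean squared error, computed on this mini-batch only
        w -= lr * 2 * np.mean(err * xb)
        b -= lr * 2 * np.mean(err)

print(round(w, 2), round(b, 2))  # close to 3 and 2
```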
ARTICLE 4. Forward propagation in neural networks
  • What is forward propagation?
  • What is the math behind it?
20 minutes
LIBRARY 5. Forward propagation from scratch in Python
  • How to implement forward propagation in Python?
20 minutes
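For reference while working through items 4 and 5, here is a minimal forward pass for a two-layer network: each layer applies a linear map followed by a non-linearity. The layer sizes and random weights are hypothetical, chosen only to show the shapes flowing through.

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def forward(X, params):
    """One forward pass through a small 2-layer network."""
    W1, b1, W2, b2 = params
    Z1 = X @ W1 + b1   # linear step, hidden layer
    A1 = relu(Z1)      # non-linearity
    Z2 = A1 @ W2 + b2  # linear step, output layer
    return Z2

rng = np.random.default_rng(1)
params = (rng.normal(size=(4, 8)), np.zeros(8),   # hypothetical sizes: 4 -> 8
          rng.normal(size=(8, 2)), np.zeros(2))   # then 8 -> 2
out = forward(rng.normal(size=(5, 4)), params)    # batch of 5 examples
print(out.shape)  # (5, 2)
```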
VIDEO 6. Why Does Batch Norm Work? [no math!]
  • What is Batch normalization?
  • Why Does Batch Norm Work?
15 minutes
VIDEO 7. Fitting Batch Norm Into Neural Networks [more advanced math here!]
  • How to fit batch norm into a neural network?
13 minutes
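For reference alongside the video, the batch norm transform applied at a layer can be written as:

```latex
% For a mini-batch B = {x_1, ..., x_m} of activations at a given unit:
\mu_B = \frac{1}{m} \sum_{i=1}^{m} x_i,
\qquad
\sigma_B^2 = \frac{1}{m} \sum_{i=1}^{m} (x_i - \mu_B)^2

% Normalize, then scale and shift with learnable parameters gamma and beta:
\hat{x}_i = \frac{x_i - \mu_B}{\sqrt{\sigma_B^2 + \epsilon}},
\qquad
y_i = \gamma \hat{x}_i + \beta
```

Here epsilon is a small constant for numerical stability, and gamma and beta let the network recover the original activation scale if that turns out to be optimal.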
LIBRARY 8. How to implement Batch Normalization (BN) in Python from scratch
  • How to implement batch normalization in neural networks using Python?
20 minutes
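A minimal sketch of the forward pass described above (training-mode only; the variable names and sizes are illustrative, and a full implementation would also track running statistics for inference):

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Batch-norm forward pass for a mini-batch x of shape (batch, features)."""
    mu = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                    # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalize
    return gamma * x_hat + beta            # scale and shift (learnable)

rng = np.random.default_rng(0)
x = rng.normal(5.0, 3.0, size=(64, 10))    # activations far from zero mean
out = batch_norm_forward(x, gamma=np.ones(10), beta=np.zeros(10))
print(out.mean(), out.std())               # ~0 and ~1
```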
LIBRARY 9. Batch normalization in Keras
  • How to implement batch normalization in Keras?
20 minutes
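In Keras the whole transform is one layer, `BatchNormalization`. A minimal sketch (the layer sizes and input shape are arbitrary placeholders):

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Place BatchNormalization between the linear layer and its activation
model = models.Sequential([
    layers.Input(shape=(20,)),
    layers.Dense(64, use_bias=False),  # bias is redundant before BN's beta
    layers.BatchNormalization(),
    layers.Activation("relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```

Note that `BatchNormalization` behaves differently in training and inference: it normalizes with batch statistics during `fit` and with accumulated running averages during prediction.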
PAPER 10. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift (OPTIONAL)
  • Where does this method come from?
30 minutes

Concepts Covered