AI-Accelerated Product Development
Learn about Batch normalization concept and math behind it
Batch normalization is a technique for training very deep neural networks that standardizes the inputs to a layer for each mini-batch.
Go through the following:
1. Normalizing Inputs
How does normalization work?
Why do we need to normalize the inputs to a neural network?
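As a concrete illustration of the normalization discussed in this step, here is a minimal NumPy sketch (the data and variable names are illustrative, not from the course materials): each input feature is shifted to zero mean and scaled to unit variance using statistics computed over the training set.

```python
import numpy as np

# Illustrative data: 100 samples, 4 features with arbitrary mean/scale.
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=3.0, size=(100, 4))

mu = X.mean(axis=0)            # per-feature mean
sigma = X.std(axis=0) + 1e-8   # per-feature std (epsilon avoids divide-by-zero)
X_norm = (X - mu) / sigma      # zero-mean, unit-variance features

print(X_norm.mean(axis=0))  # each entry close to 0
print(X_norm.std(axis=0))   # each entry close to 1
```

The same `mu` and `sigma` computed on the training set would then be reused to normalize validation and test inputs.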
2. Intro on mini batch gradient descent (with pseudo code)
What is Mini-Batch Gradient Descent?
How to Configure Mini-Batch Gradient Descent?
3. Mini-batch GD from scratch in Python
How to implement mini-batch GD in Python?
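A minimal sketch of the from-scratch implementation this step asks for, using linear regression with mean-squared error as the model (the learning rate, batch size, and synthetic data are illustrative assumptions, not values from the course):

```python
import numpy as np

# Synthetic regression problem: y = X @ true_w + noise.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=200)

w = np.zeros(3)
lr, batch_size, epochs = 0.1, 32, 50

for epoch in range(epochs):
    idx = rng.permutation(len(X))            # shuffle once per epoch
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        Xb, yb = X[batch], y[batch]
        # Gradient of the mean-squared error on this mini-batch.
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(batch)
        w -= lr * grad                       # gradient-descent update

print(w)  # close to true_w
```

The key difference from batch gradient descent is that each parameter update uses only one mini-batch, trading gradient accuracy for many cheap, frequent updates.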
4. Forward propagation in neural networks
What is Forward propagation?
What is the math behind this concept?
5. Forward propagation from scratch in Python
How to implement FP in Python?
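A sketch of forward propagation through a small two-layer network (ReLU hidden layer, linear output); the layer sizes and parameter names are illustrative assumptions:

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def forward(X, params):
    Z1 = X @ params["W1"] + params["b1"]    # linear step, hidden layer
    A1 = relu(Z1)                           # non-linear activation
    Z2 = A1 @ params["W2"] + params["b2"]   # linear step, output layer
    return Z2

rng = np.random.default_rng(0)
params = {
    "W1": rng.normal(size=(4, 8)), "b1": np.zeros(8),   # 4 inputs -> 8 hidden
    "W2": rng.normal(size=(8, 1)), "b2": np.zeros(1),   # 8 hidden -> 1 output
}
X = rng.normal(size=(5, 4))       # batch of 5 samples, 4 features each
print(forward(X, params).shape)   # (5, 1): one output per sample
```

Each layer repeats the same pattern, linear transform followed by an activation, which is also where batch normalization will later slot in.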
6. Why Does Batch Norm Work? [no math!]
What is Batch normalization?
Why Does Batch Norm Work?
7. Fitting Batch Norm Into Neural Networks [more advanced math here!]
How to fit batch norm into a neural network?
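The math this step refers to is the standard batch-norm transform from the Ioffe & Szegedy paper: for pre-activations $z^{(i)}$ over a mini-batch of size $m$, with learned scale $\gamma$ and shift $\beta$,

```latex
\mu_B = \frac{1}{m}\sum_{i=1}^{m} z^{(i)}, \qquad
\sigma_B^2 = \frac{1}{m}\sum_{i=1}^{m} \left(z^{(i)} - \mu_B\right)^2
```

```latex
\hat{z}^{(i)} = \frac{z^{(i)} - \mu_B}{\sqrt{\sigma_B^2 + \varepsilon}}, \qquad
\tilde{z}^{(i)} = \gamma \, \hat{z}^{(i)} + \beta
```

The normalized $\tilde{z}^{(i)}$ replaces $z^{(i)}$ as the input to the layer's activation function; $\varepsilon$ is a small constant for numerical stability.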
8. How to implement Batch Normalization (BN) using Python from scratch
How to implement Batch Normalization In Neural Networks using Python?
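A minimal sketch of the batch-norm forward pass (training mode) in NumPy; the function name, batch shape, and epsilon value are illustrative assumptions. A full implementation would also track running statistics for inference and implement the backward pass:

```python
import numpy as np

def batch_norm_forward(Z, gamma, beta, eps=1e-5):
    mu = Z.mean(axis=0)                     # per-feature batch mean
    var = Z.var(axis=0)                     # per-feature batch variance
    Z_hat = (Z - mu) / np.sqrt(var + eps)   # normalize to ~zero mean, unit var
    return gamma * Z_hat + beta             # learnable scale and shift

rng = np.random.default_rng(1)
Z = rng.normal(loc=3.0, scale=2.0, size=(64, 10))  # batch of 64 pre-activations
out = batch_norm_forward(Z, gamma=np.ones(10), beta=np.zeros(10))
print(out.mean(axis=0))  # each entry close to 0
print(out.std(axis=0))   # each entry close to 1
```

With `gamma=1` and `beta=0` the output is simply the normalized batch; during training the network learns `gamma` and `beta` so it can recover any useful scale and shift.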
9. Batch normalization in Keras
How to implement batch normalization in Keras?
10. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift (OPTIONAL)
Where does this method come from?