AI-Accelerated Product Development
Learn about the batch normalization concept and the math behind it
Potential Use Cases
Batch normalization is a technique for training very deep neural networks that standardizes the inputs to a layer for each mini-batch.
Go through the following:
1. Normalizing Inputs
How does normalization work?
Why do we need to normalize our inputs in a neural network?
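Before the mini-batch material, it can help to see what input normalization actually does. Below is a minimal NumPy sketch (the function name is my own); each feature is standardized to zero mean and unit variance, and the training statistics are returned so the same transform can be applied to test data:

```python
import numpy as np

def normalize_inputs(X):
    """Standardize each feature of X (shape: n_samples, n_features)
    to zero mean and unit variance. Returns the normalized data plus
    the training statistics, which must be reused on test data."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0) + 1e-8  # small epsilon avoids division by zero
    return (X - mu) / sigma, mu, sigma

# Toy usage: two features on wildly different scales
X = np.array([[1.0, 1000.0],
              [2.0, 2000.0],
              [3.0, 3000.0]])
X_norm, mu, sigma = normalize_inputs(X)
```

After this transform every feature contributes on a comparable scale, which is the usual motivation given for normalizing inputs: gradient descent takes better-conditioned steps.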
2. Intro on mini batch gradient descent (with pseudo code)
What is Mini-Batch Gradient Descent?
How to Configure Mini-Batch Gradient Descent?
3. Mini-batch GD from scratch in Python
How to implement mini-batch GD in Python?
4. Forward propagation in neural networks
What is Forward propagation?
What is the math behind this concept?
5. Forward propagation from scratch in Python
How to implement forward propagation in Python?
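For orientation on steps 4 and 5, a bare-bones forward pass for a two-layer fully connected network might look like this (layer sizes and the sigmoid activation are my own choices for the sketch): each layer computes a linear map followed by a nonlinearity.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, params):
    """Forward propagation through a 2-layer network:
    z1 = X W1 + b1, a1 = sigmoid(z1), z2 = a1 W2 + b2, a2 = sigmoid(z2).
    Shapes: X (n, d_in), W1 (d_in, h), W2 (h, d_out)."""
    z1 = X @ params["W1"] + params["b1"]
    a1 = sigmoid(z1)
    z2 = a1 @ params["W2"] + params["b2"]
    a2 = sigmoid(z2)
    return a2

rng = np.random.default_rng(0)
params = {
    "W1": rng.standard_normal((3, 4)), "b1": np.zeros(4),
    "W2": rng.standard_normal((4, 1)), "b2": np.zeros(1),
}
out = forward(rng.standard_normal((5, 3)), params)
```

Batch norm (steps 6-8) slots into exactly this pipeline, between the linear step z and the activation a.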
6. Why Does Batch Norm Work? [no math!]
What is Batch normalization?
Why Does Batch Norm Work?
7. Fitting Batch Norm Into Neural Networks [more advanced math here!]
How to fit batch norm into a neural network?
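For reference while working through this step, the batch-norm transform (from the paper linked in step 10) over a mini-batch $\{x_1,\dots,x_m\}$ is:

```latex
\mu_B = \frac{1}{m}\sum_{i=1}^{m} x_i,
\qquad
\sigma_B^2 = \frac{1}{m}\sum_{i=1}^{m}\left(x_i - \mu_B\right)^2,
\qquad
\hat{x}_i = \frac{x_i - \mu_B}{\sqrt{\sigma_B^2 + \epsilon}},
\qquad
y_i = \gamma\,\hat{x}_i + \beta
```

Here $\gamma$ and $\beta$ are learned per-feature scale and shift parameters, and $\epsilon$ is a small constant for numerical stability.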
8. How to implement Batch Normalization (BN) using Python from scratch
How to implement Batch Normalization In Neural Networks using Python?
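A minimal from-scratch sketch of the BN forward pass for a fully connected layer (function name and shapes are my own; the backward pass and running-average inference statistics, which the linked material covers, are omitted here):

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Batch-norm forward pass for a mini-batch x of shape (m, d):
    normalize each feature over the batch, then scale and shift.
    gamma, beta: learnable per-feature parameters of shape (d,)."""
    mu = x.mean(axis=0)                        # per-feature batch mean
    var = x.var(axis=0)                        # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)      # normalized activations
    return gamma * x_hat + beta                # learned scale and shift

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 3)) * 5 + 2       # badly scaled activations
out = batchnorm_forward(x, gamma=np.ones(3), beta=np.zeros(3))
```

With gamma = 1 and beta = 0 the output is simply the standardized activations; during training the network can learn gamma and beta to undo the normalization wherever that helps.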
9. Batch normalization in Keras
How to implement batch normalization in Keras?
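In Keras this is a one-line layer. A small sketch (architecture and toy data are my own, not from the linked tutorial) using the common Dense → BatchNormalization → activation placement; since BN adds its own shift, the preceding Dense layer's bias is redundant:

```python
import numpy as np
import tensorflow as tf

# Dense -> BatchNormalization -> activation; the model builds its
# input shape lazily on the first call to fit().
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, use_bias=False),  # bias is redundant before BN
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Activation("relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Toy data just to exercise training; BN uses batch statistics during
# training and its running averages at inference time.
X = np.random.randn(64, 4).astype("float32")
y = (X.sum(axis=1) > 0).astype("float32")
model.fit(X, y, epochs=1, batch_size=16, verbose=0)
```

Note that `BatchNormalization` behaves differently in training and inference (batch statistics vs. moving averages), which Keras handles automatically via the layer's `training` flag.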
10. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift (OPTIONAL)
Where does this method come from?