AI-Accelerated Product Development
Learn about the batch normalization concept and the math behind it
Batch normalization is a technique for training very deep neural networks that standardizes the inputs to a layer for each mini-batch.
Go through the following:
1. Normalizing Inputs
How does normalization work?
Why do we need to normalize the inputs to a neural network?
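A minimal sketch of input normalization using NumPy, on a made-up feature matrix (the data and feature scales are illustrative, not from the lesson): each column is standardized to zero mean and unit variance so no single large-scale feature dominates training.

```python
import numpy as np

# Hypothetical feature matrix: 5 samples, 3 features on very different scales.
X = np.array([[1.0, 200.0, 0.001],
              [2.0, 180.0, 0.004],
              [3.0, 240.0, 0.002],
              [4.0, 210.0, 0.003],
              [5.0, 190.0, 0.005]])

# Standardize each feature: subtract its mean and divide by its std deviation,
# so every column ends up with mean ~0 and std ~1.
mu = X.mean(axis=0)
sigma = X.std(axis=0)
X_norm = (X - mu) / sigma

print(X_norm.mean(axis=0))  # ≈ [0, 0, 0]
print(X_norm.std(axis=0))   # ≈ [1, 1, 1]
```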
2. Intro to mini-batch gradient descent (with pseudocode)
What is Mini-Batch Gradient Descent?
How to Configure Mini-Batch Gradient Descent?
3. Mini-batch GD from scratch in Python
How to implement mini-batch GD in Python?
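The two steps above can be sketched together: a from-scratch mini-batch gradient descent loop fitting a one-parameter linear model on synthetic data (the data, learning rate, and batch size are illustrative assumptions, not from the lesson). Each epoch shuffles the data, then updates the parameters from the gradient of the mean squared error over one mini-batch at a time.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 3x + 2 plus a little noise (illustrative only).
X = rng.uniform(-1, 1, size=(200, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 0.1, size=200)

w, b = 0.0, 0.0     # parameters of the linear model
lr = 0.1            # learning rate
batch_size = 32

for epoch in range(100):
    idx = rng.permutation(len(X))              # reshuffle every epoch
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]  # indices of this mini-batch
        xb, yb = X[batch, 0], y[batch]
        err = (w * xb + b) - yb
        # Gradients of mean squared error over the mini-batch only.
        w -= lr * 2 * np.mean(err * xb)
        b -= lr * 2 * np.mean(err)

print(round(w, 2), round(b, 2))  # should approach 3.0 and 2.0
```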
4. Forward propagation in neural networks
What is Forward propagation?
What is the math behind this concept?
5. Forward propagation from scratch in Python
How to implement FP in Python?
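A minimal forward-propagation sketch for a one-hidden-layer network (the layer sizes, ReLU activation, and random weights are illustrative assumptions): each layer computes an affine transform of its input, and hidden layers pass the result through a nonlinearity.

```python
import numpy as np

def relu(z):
    # Elementwise nonlinearity: max(0, z).
    return np.maximum(0, z)

def forward(X, params):
    """Forward pass: z1 = X W1 + b1, a1 = relu(z1), output = a1 W2 + b2."""
    z1 = X @ params["W1"] + params["b1"]
    a1 = relu(z1)
    return a1 @ params["W2"] + params["b2"]

rng = np.random.default_rng(0)
params = {
    "W1": rng.normal(size=(4, 8)) * 0.1, "b1": np.zeros(8),
    "W2": rng.normal(size=(8, 2)) * 0.1, "b2": np.zeros(2),
}
X = rng.normal(size=(5, 4))   # 5 samples, 4 input features
out = forward(X, params)
print(out.shape)  # (5, 2): one 2-dimensional output per sample
```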
6. Why Does Batch Norm Work? [no math!]
What is Batch normalization?
Why Does Batch Norm Work?
7. Fitting Batch Norm Into Neural Networks [more advanced math here!]
How to fit batch norm into a neural network?
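For reference, the core batch-norm transform applied to each activation, as introduced in the original paper (step 10 below): for a mini-batch $\mathcal{B} = \{x_1, \dots, x_m\}$,

```latex
\mu_{\mathcal{B}} = \frac{1}{m} \sum_{i=1}^{m} x_i
\qquad
\sigma_{\mathcal{B}}^2 = \frac{1}{m} \sum_{i=1}^{m} (x_i - \mu_{\mathcal{B}})^2
```

```latex
\hat{x}_i = \frac{x_i - \mu_{\mathcal{B}}}{\sqrt{\sigma_{\mathcal{B}}^2 + \epsilon}}
\qquad
y_i = \gamma \hat{x}_i + \beta
```

Here $\epsilon$ is a small constant for numerical stability, and $\gamma$, $\beta$ are learnable scale and shift parameters, so the network can still represent the identity transform if that is optimal.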
8. How to implement Batch Normalization (BN) using Python from scratch
How to implement batch normalization in a neural network using Python?
9. Batch normalization in Keras
How to implement batch normalization in Keras?
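In Keras this is a one-line layer, `tf.keras.layers.BatchNormalization`. A minimal sketch, assuming a generic binary-classification setup (the layer sizes and input dimension are illustrative, not from the lesson):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, input_shape=(20,)),
    tf.keras.layers.BatchNormalization(),   # normalizes the Dense pre-activations per batch
    tf.keras.layers.Activation("relu"),     # common pattern: BN before the nonlinearity
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```

The layer keeps its own running mean and variance, so it automatically switches from batch statistics during training to the running averages at inference.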
10. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift (OPTIONAL)
Where does this method come from?