Learn the theory behind AdaGrad as an optimizer and how to implement it in Python
Potential Use Cases
AdaGrad is a gradient-based optimization algorithm. It is well suited to problems with sparse data, such as natural language processing (NLP) or image recognition.
Who Is This For?
1. Intro to mathematical optimization
What is mathematical optimization?
Why do we need to optimize a cost function in ML algorithms?
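To make the idea concrete, here is a minimal sketch of what "optimizing a cost function" means. The model, data, and search grid are hypothetical examples, not taken from the lesson: we fit a one-parameter model y = w * x by searching for the weight w that minimizes a mean-squared-error cost.

```python
# Hypothetical example: mathematical optimization means finding the
# parameter value that minimizes a cost function.

def cost(w, xs, ys):
    """Mean squared error of predictions w * x against targets y."""
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]  # generated by the true weight w = 2

# Brute-force search over candidate weights from 0.0 to 4.0 in steps of 0.1.
best_w = min((w / 10 for w in range(0, 41)), key=lambda w: cost(w, xs, ys))
print(best_w)  # the cost is minimized at w = 2.0
```

Brute-force search only works for toy problems; gradient descent (next section) scales the same idea to models with many parameters.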
2. Gradient Descent
What is Gradient Descent (GD)?
How does GD work in Python?
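A minimal gradient descent sketch, using a hypothetical one-dimensional function f(w) = (w - 3)^2 rather than anything from the lesson: starting from w = 0, we repeatedly step against the gradient f'(w) = 2(w - 3) until w approaches the minimum at 3.

```python
# Gradient descent on f(w) = (w - 3)^2, whose minimum is at w = 3.
# (Hypothetical function and hyperparameters, for illustration only.)

def grad(w):
    """Gradient of f(w) = (w - 3)^2."""
    return 2.0 * (w - 3.0)

w = 0.0    # starting point
lr = 0.1   # learning rate (step size)
for _ in range(100):
    w -= lr * grad(w)  # step in the direction of steepest descent

print(w)  # converges very close to 3.0
```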
3. Learning Rate
What is learning rate?
How can I choose a good learning rate?
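The effect of the learning rate can be sketched on the hypothetical function f(w) = w^2 (gradient 2w); the specific rates below are illustrative, not values from the lesson. Too small a rate makes slow progress, a moderate rate converges quickly, and too large a rate overshoots and diverges.

```python
# How the learning rate affects gradient descent on f(w) = w^2.
# (Hypothetical function and learning rates, for illustration only.)

def run_gd(lr, steps=50, w=1.0):
    """Run `steps` gradient descent updates on f(w) = w^2 and return w."""
    for _ in range(steps):
        w -= lr * 2.0 * w  # gradient of w^2 is 2w
    return w

print(abs(run_gd(0.01)))  # too small: still far from the minimum at 0
print(abs(run_gd(0.4)))   # moderate: essentially converged to 0
print(abs(run_gd(1.1)))   # too large: |w| grows, the iteration diverges
```

AdaGrad, introduced next, sidesteps part of this tuning problem by adapting the step size per parameter.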
4. AdaGrad: Introduction (No Math!)
What is AdaGrad?
5. Adaptive Gradient (AdaGrad): Introduction (with more advanced math concepts)
What is AdaGrad?
What is the math behind this optimizer?
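The math can be sketched as follows, using the conventional notation (η for the global learning rate, ε for a small stability constant; these symbols are conventions, not taken from this outline). For each parameter coordinate i at step t:

```latex
\begin{aligned}
g_{t,i} &= \nabla_{\theta_i} J(\theta_t) && \text{gradient of the cost} \\
G_{t,i} &= G_{t-1,i} + g_{t,i}^2 && \text{accumulated squared gradients} \\
\theta_{t+1,i} &= \theta_{t,i} - \frac{\eta}{\sqrt{G_{t,i}} + \epsilon}\, g_{t,i} && \text{per-coordinate update}
\end{aligned}
```

Because G_{t,i} grows fastest for frequently updated coordinates, rarely updated (sparse) features keep a larger effective learning rate, which is why AdaGrad suits sparse data.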
6. AdaGrad in Python
How do I implement AdaGrad in Python?
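A from-scratch sketch of the implementation, under the assumption that AdaGrad accumulates squared gradients per coordinate and divides each step by their square root. The test function, learning rate, and step count are hypothetical choices for illustration, not values from the lesson.

```python
import math

def adagrad(grad_fn, w, lr=1.0, eps=1e-8, steps=200):
    """Minimize a function via AdaGrad, given its gradient `grad_fn`.

    Each coordinate keeps its own running sum of squared gradients,
    so frequently updated coordinates take progressively smaller steps.
    """
    g_acc = [0.0] * len(w)  # G: running sum of squared gradients
    for _ in range(steps):
        g = grad_fn(w)
        for i in range(len(w)):
            g_acc[i] += g[i] ** 2  # G_t = G_{t-1} + g_t^2
            w[i] -= lr * g[i] / (math.sqrt(g_acc[i]) + eps)
    return w

# Hypothetical test problem: f(w) = w1^2 + 10 * w2^2,
# with gradient [2 * w1, 20 * w2] and minimum at (0, 0).
grad_fn = lambda w: [2.0 * w[0], 20.0 * w[1]]
w = adagrad(grad_fn, [5.0, 5.0])
print(w)  # both coordinates approach 0 despite very different curvatures
```

Note how a single global learning rate handles both the flat and the steep coordinate; with plain gradient descent, one rate that is safe for the steep direction would be slow in the flat one.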
7. Adaptive Subgradient Methods for Online Learning and Stochastic Optimization (optional)
Where does this optimizer come from?