Covers: implementation of Gradient Descent
Estimated time needed to finish: 20 minutes
Questions this item addresses:
  • How to implement Gradient Descent in Python?
How to use this item?

In this last module of the prerequisite concepts for RMSprop, this code will help us better understand how gradient descent (GD) works in Python.
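As a preview, here is a minimal sketch of gradient descent in Python (my own illustration under the standard update rule, not this item's actual code):

```python
import numpy as np

def gradient_descent(grad, x0, learning_rate=0.1, n_iters=100):
    """Minimize a function given its gradient `grad`, starting from `x0`."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = x - learning_rate * grad(x)  # step against the gradient
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3).
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(minimum)  # converges toward the minimizer x = 3
```

The only tunable parts here are the learning rate and the iteration count; the gradient function itself is supplied by the caller.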

Author(s) / creator(s) / reference(s)
Programming Languages: Python


Total time needed: ~88 minutes
Learning the RMSprop concept, the math behind it, and how to code it in Python
Potential Use Cases
RMSprop is a gradient-based optimization technique used in training neural networks.
Who Is This For?
ARTICLE 1. Intro to mathematical optimization
  • What is mathematical optimization?
  • Why do we need to optimize a cost function in ML algorithms?
10 minutes
VIDEO 2. Gradient Descent
  • What is Gradient Descent (GD)?
  • How does GD work in Python?
10 minutes
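The update rule the video builds up to can be written, in standard notation (my notation, not taken from the video itself), as:

```latex
\theta_{t+1} = \theta_t - \eta \, \nabla_{\theta} J(\theta_t)
```

where \(\theta\) are the parameters being learned, \(\eta\) is the learning rate, and \(J\) is the cost function being minimized.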
LIBRARY 3. Gradient Descent in Python
  • How to implement Gradient Descent in Python?
20 minutes
VIDEO 4. RMSprop
  • What is RMSprop?
  • How does this algorithm work?
8 minutes
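The algorithm asked about above follows the standard RMSprop update (notation assumed, not taken from the video): keep an exponentially decaying average of squared gradients and divide each step by its square root:

```latex
v_t = \rho\, v_{t-1} + (1 - \rho)\, g_t^2, \qquad
\theta_{t+1} = \theta_t - \frac{\eta}{\sqrt{v_t} + \epsilon}\, g_t
```

where \(g_t\) is the gradient at step \(t\), \(\rho\) is the decay rate (typically 0.9), \(\eta\) is the learning rate, and \(\epsilon\) is a small constant for numerical stability.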
ARTICLE 5. RMSprop: Divide the gradient by a running average of its recent magnitude
  • Why rprop does not work with mini-batches
  • Further developments of rmsprop
10 minutes
LIBRARY 6. RMSprop from scratch in Python
  • How to code RMSprop in Python from scratch?
10 minutes
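One possible from-scratch sketch, following the standard RMSprop equations (an illustration of mine, not the library item's own code):

```python
import numpy as np

def rmsprop(grad, x0, learning_rate=0.01, rho=0.9, eps=1e-8, n_iters=1000):
    """Minimize using RMSprop: scale each step by a running
    average of recent squared gradients."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)  # running average of squared gradients
    for _ in range(n_iters):
        g = grad(x)
        v = rho * v + (1 - rho) * g ** 2          # decaying average
        x = x - learning_rate * g / (np.sqrt(v) + eps)  # scaled step
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3).
minimum = rmsprop(lambda x: 2 * (x - 3), x0=np.array([0.0]))
print(minimum)  # approaches the minimizer x = 3
```

Because each coordinate is divided by its own running root-mean-square gradient, the effective step size adapts per parameter, which is the core idea the earlier items motivate.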
LIBRARY 7. Implement RMSprop in Keras
  • How to implement RMSprop in Keras?
20 minutes
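In Keras, RMSprop is available as a built-in optimizer. A minimal compile-time sketch might look like the following (the model architecture is an arbitrary placeholder, not from this item):

```python
import tensorflow as tf

# Placeholder model: a single dense layer over 4 input features.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])

# Keras' built-in RMSprop; rho is the decay rate of the
# running average of squared gradients.
optimizer = tf.keras.optimizers.RMSprop(learning_rate=0.001, rho=0.9)
model.compile(optimizer=optimizer, loss="mse")
```

After compiling, `model.fit(x, y)` trains the weights with RMSprop updates.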

Concepts Covered
