RMSprop

Contributors
Total time needed: ~88 minutes
Objectives
Learn the RMSprop concept, the math behind it, and how to code it in Python.
Potential Use Cases
RMSprop is a gradient-based optimization technique used in training neural networks.
Who Is This For?
INTERMEDIATE
Resources (7 items)
ARTICLE 1. Intro to mathematical optimization
  • What is mathematical optimization?
  • Why do we need to optimize a cost function in ML algorithms?
10 minutes
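
To make the idea concrete, an optimization problem in ML typically takes the following form (standard notation assumed here, not taken from the linked article): a cost function J is minimized over the model parameters theta, for example the mean squared error:

\min_{\theta} J(\theta), \qquad J(\theta) = \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - f_\theta(x_i)\bigr)^2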
VIDEO 2. Gradient Descent
  • What is Gradient Descent (GD)?
  • How does GD work in Python?
10 minutes
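
As a quick reference for the video, the basic gradient descent update (standard notation, with learning rate \eta) is:

\theta_{t+1} = \theta_t - \eta \, \nabla_\theta J(\theta_t)

Each step moves the parameters against the gradient of the cost, so J decreases for a small enough learning rate.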
LIBRARY 3. Gradient Descent in Python
  • How to implement Gradient Descent in Python?
20 minutes
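
Before opening the library, here is a minimal sketch of gradient descent on a toy quadratic cost; the cost function and hyperparameter values are illustrative assumptions, not code from the linked library:

def grad(theta):
    # Analytic gradient of the toy cost J(theta) = (theta - 3)^2
    return 2.0 * (theta - 3.0)

theta = 0.0          # initial guess
learning_rate = 0.1  # step size (eta)

for step in range(100):
    # Move against the gradient to decrease the cost
    theta -= learning_rate * grad(theta)

print(theta)  # converges toward the minimizer theta = 3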
VIDEO 4. RMSprop
  • What is RMSprop?
  • How does this algorithm work?
8 minutes
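
For reference, the RMSprop update keeps an exponentially decaying average of squared gradients and divides each step by its root (standard formulation; typical defaults are \rho = 0.9 and a small \epsilon for numerical stability):

E[g^2]_t = \rho \, E[g^2]_{t-1} + (1 - \rho)\, g_t^2

\theta_{t+1} = \theta_t - \frac{\eta}{\sqrt{E[g^2]_t} + \epsilon}\, g_t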
ARTICLE 5. RMSprop: Divide the gradient by a running average of its recent magnitude
  • Why Rprop does not work with mini-batches
  • Further developments of RMSprop
10 minutes
LIBRARY 6. RMSprop from scratch in Python
  • How to code RMSprop in Python from scratch?
10 minutes
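
A minimal from-scratch sketch of RMSprop on a toy quadratic; the hyperparameter values are the common defaults and the toy cost is an assumption, not code from the linked library:

import numpy as np

def rmsprop(grad_fn, theta0, lr=0.01, rho=0.9, eps=1e-8, n_steps=200):
    theta = np.asarray(theta0, dtype=float)
    avg_sq_grad = np.zeros_like(theta)  # running average of squared gradients
    for _ in range(n_steps):
        g = grad_fn(theta)
        # Decaying average of the squared gradient
        avg_sq_grad = rho * avg_sq_grad + (1.0 - rho) * g ** 2
        # Scale each step by the root of that running average
        theta -= lr * g / (np.sqrt(avg_sq_grad) + eps)
    return theta

# Toy cost J(theta) = theta_0^2 + 10 * theta_1^2, gradient given analytically
result = rmsprop(lambda t: np.array([2.0 * t[0], 20.0 * t[1]]), theta0=[1.0, 1.0])
print(result)  # approaches [0, 0]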
LIBRARY 7. Implement RMSprop in Keras
  • How to implement RMSprop in Keras?
20 minutes
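
A minimal sketch of using Keras's built-in RMSprop optimizer; the toy model and input shape are placeholders, and the hyperparameters shown are the documented defaults:

import tensorflow as tf

# Placeholder model; only the optimizer choice matters here
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(1),
])

optimizer = tf.keras.optimizers.RMSprop(learning_rate=0.001, rho=0.9)
model.compile(optimizer=optimizer, loss="mse")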

Concepts Covered
Mathematical optimization, Gradient Descent, RMSprop