Covers: theory of

- Why should you slowly reduce your learning rate?

Why is this important?

Adjusting the learning rate during training can help accelerate the whole training process.
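One common way to adjust the learning rate is step decay, where the rate is cut by a fixed factor every few epochs. Below is a minimal framework-free sketch; the function name and parameters (`step_decay`, `drop_every`, `factor`) are illustrative, not taken from the course materials.

```python
# Step decay: multiply the learning rate by `factor` every `drop_every` epochs.
# Illustrative sketch only; deep learning frameworks ship built-in schedulers.
def step_decay(initial_lr, epoch, drop_every=10, factor=0.5):
    """Return the learning rate to use at a given epoch under step decay."""
    return initial_lr * (factor ** (epoch // drop_every))

# The learning rate shrinks as training progresses:
lrs = [step_decay(0.1, e) for e in (0, 10, 20)]  # 0.1, 0.05, 0.025
```

Large steps early on speed up initial progress; smaller steps later let the optimizer settle into a minimum instead of bouncing around it.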

If the video fails to play, open the link directly: https://youtu.be/QzulmoOg2JE

Andrew Ng


- Objectives: With this recipe you will understand how to train a neural network.
- Potential Use Cases: Build your own neural network.
- Who Is This For?: Beginners.

Click on each of the following **annotated items** to see details.

Resources

VIDEO 1. Loss Functions

- What is the relevance of loss functions for deep learning?

17 minutes

VIDEO 2. Optimization using Gradient Descent - Part 1

- How do neural networks learn?

26 minutes

VIDEO 3. Optimization using Gradient Descent - Part 2

- How do you use gradient descent for parameter updating?

17 minutes

VIDEO 4. Chain Rule, Backpropagation & Autograd

- How else can I train my neural nets?

21 minutes

REPO 5. Hands-on Optimization

- How to implement optimization methods in PyTorch?
- How can we update parameters with Gradient Descent?
- How to implement gradient descent in PyTorch?

30 minutes
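The repo above implements these updates in PyTorch; the core rule is the same in any framework. Here is a framework-free sketch of gradient descent on a simple quadratic loss, assuming the analytic gradient is known (everything in it is illustrative, not code from the repo).

```python
# Gradient descent on f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
# The minimum is at w = 3. Illustrative sketch of the update rule
# w <- w - lr * grad, not the repo's PyTorch implementation.
def gradient_descent(lr=0.1, steps=100):
    w = 0.0
    for _ in range(steps):
        grad = 2 * (w - 3)  # analytic gradient of the loss at w
        w -= lr * grad      # parameter update
    return w
```

Each step moves the parameter against the gradient, so `w` converges toward the minimizer at 3; in PyTorch the gradient would come from autograd rather than a hand-derived formula.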

RECIPE 6. Understanding Backpropagation

4 hours

VIDEO 7. Learning Rate Decay

- Why should you slowly reduce your learning rate?

6 minutes

VIDEO 8. Weights Initialization

- Why should you initialize parameters for a DNN?

6 minutes
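One standard initialization the video's topic covers is Xavier/Glorot initialization, which scales random weights by layer size to keep activation variance roughly constant across layers. A minimal pure-Python sketch, assuming the common uniform variant (PyTorch provides this as `torch.nn.init.xavier_uniform_`):

```python
import math
import random

# Xavier/Glorot uniform initialization for a fan_in x fan_out weight matrix:
# draw each weight from U(-b, b) with b = sqrt(6 / (fan_in + fan_out)).
# Illustrative sketch; frameworks provide this built in.
def xavier_uniform(fan_in, fan_out, seed=0):
    rng = random.Random(seed)
    bound = math.sqrt(6.0 / (fan_in + fan_out))
    return [[rng.uniform(-bound, bound) for _ in range(fan_out)]
            for _ in range(fan_in)]

weights = xavier_uniform(4, 3)  # 4x3 weight matrix, entries in (-b, b)
```

Without such scaling, weights that are too large or too small cause activations (and gradients) to explode or vanish as depth grows, which is why initialization matters for DNNs.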
