Contributors

Covers: theory of

- Why initialize parameters for a DNN?

Go through the whole video and take notes on how important it is to initialize weights in deep neural networks.

If the video fails to play, open the link directly: https://youtu.be/s2coXdufOzE

Andrew Ng
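The video's point can be illustrated with a minimal sketch (my own, not from the lecture): if every weight starts at the same constant, all hidden units compute the same activation and receive the same gradient, so they remain copies of each other and the network cannot learn distinct features.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Two-layer net where every parameter starts at the same constant value:
# all hidden units then compute the same activation and get the same gradient.
net = nn.Sequential(nn.Linear(3, 4), nn.Tanh(), nn.Linear(4, 1))
for p in net.parameters():
    nn.init.constant_(p, 0.5)

x = torch.randn(8, 3)
y = torch.randn(8, 1)
loss = nn.MSELoss()(net(x), y)
loss.backward()

# Every row of the first layer's weight gradient is identical (symmetry),
# so a gradient step keeps the hidden units identical forever.
g = net[0].weight.grad
print(torch.allclose(g, g[0].expand_as(g)))  # True
```

Random initialization breaks this symmetry, which is why it matters.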



- Objectives: with this recipe you will understand how to train a neural network
- Potential use cases: build your own neural network
- Who is this for? Beginner

Click on each of the following **annotated items** to see details.

Resources

VIDEO 1. Loss Functions

- What is the relevance of loss functions for deep learning?

17 minutes
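As a concrete sketch (my own example, not taken from the video), here is how two standard losses are computed in PyTorch: mean squared error for regression and cross-entropy for classification.

```python
import torch
import torch.nn.functional as F

# Regression: mean squared error between predictions and targets.
pred = torch.tensor([2.5, 0.0, 2.0])
target = torch.tensor([3.0, -0.5, 2.0])
mse = F.mse_loss(pred, target)  # mean of (pred - target)^2

# Classification: cross-entropy between raw scores (logits) and a class label.
logits = torch.tensor([[2.0, 0.5, 0.1]])  # scores for 3 classes
label = torch.tensor([0])                 # true class index
ce = F.cross_entropy(logits, label)       # -log softmax(logits)[label]

print(mse.item(), ce.item())
```

Both return a single scalar, which is exactly what gradient-based training needs: one number to drive downhill.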

VIDEO 2. Optimization using Gradient Descent - Part 1

- How do neural networks learn?

26 minutes
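In one sentence: a network learns by repeatedly nudging its parameters against the gradient of the loss. A minimal one-parameter sketch of that loop (my own illustration):

```python
# Minimizing f(w) = (w - 3)^2 with plain gradient descent.
w = 0.0
lr = 0.1
for _ in range(100):
    grad = 2 * (w - 3)   # derivative of (w - 3)^2
    w = w - lr * grad    # step against the gradient
print(round(w, 4))  # 3.0, the minimum of f
```

A neural network does the same thing with millions of parameters, using backpropagation to get all the derivatives at once.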

VIDEO 3. Optimization using Gradient Descent - Part 2

- How do you use gradient descent for parameter updates?

17 minutes
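The standard update rule, parameter ← parameter − learning_rate × gradient, can be sketched in PyTorch like this (a minimal example of my own, not from the video):

```python
import torch

# One gradient-descent update on a linear model y = w * x + b.
w = torch.tensor(1.0, requires_grad=True)
b = torch.tensor(0.0, requires_grad=True)
x = torch.tensor([1.0, 2.0, 3.0])
y = torch.tensor([3.0, 5.0, 7.0])   # generated by w=2, b=1

loss = ((w * x + b - y) ** 2).mean()
loss.backward()                      # gradients land in w.grad and b.grad

lr = 0.1
with torch.no_grad():                # update without tracking gradients
    w -= lr * w.grad
    b -= lr * b.grad
    w.grad.zero_()                   # reset gradients for the next step
    b.grad.zero_()
print(w.item(), b.item())
```

Both parameters move toward the true values (2 and 1); repeating this step is the whole training loop.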

VIDEO 4. Chain Rule, Backpropagation & Autograd

- How else can I train my neural nets?

21 minutes
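The chain rule is what autograd automates. A tiny sketch (my own) comparing a hand-computed derivative with PyTorch's autograd:

```python
import torch

# Chain rule by hand vs. autograd for y = (3x + 1)^2.
x = torch.tensor(2.0, requires_grad=True)
u = 3 * x + 1
y = u ** 2
y.backward()                 # autograd applies the chain rule for us

# dy/dx = dy/du * du/dx = 2u * 3 = 6 * (3x + 1)
manual = 6 * (3 * 2.0 + 1)
print(x.grad.item(), manual)  # 42.0 42.0
```

Autograd records every operation in a graph and replays the chain rule backward through it, which is what makes deep nets trainable without deriving gradients by hand.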

REPO 5. Hands-on Optimization

- How to implement optimization methods in PyTorch?
- How can we update parameters with gradient descent?
- How to implement gradient descent in PyTorch?

30 minutes
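A minimal sketch of what such an implementation might look like (my own example, not the repo's actual code), using PyTorch's built-in `torch.optim.SGD`:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(1, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.05)
loss_fn = nn.MSELoss()

x = torch.linspace(-1, 1, 32).unsqueeze(1)
y = 2 * x + 1                      # target function to fit

for _ in range(200):
    opt.zero_grad()                # clear gradients from the last step
    loss = loss_fn(model(x), y)
    loss.backward()                # compute fresh gradients
    opt.step()                     # gradient-descent parameter update

print(loss.item())                 # should be small after training
```

The three-call pattern `zero_grad()` / `backward()` / `step()` is the same for every optimizer in `torch.optim`, so swapping SGD for Adam is a one-line change.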

RECIPE 6. Understanding BackPropagation

4 hours
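To check your understanding of backpropagation, it helps to compute the gradients of a tiny network by hand and compare them with autograd. A sketch of my own (assuming a one-hidden-unit net; the recipe may use a different example):

```python
import torch

# Manual backprop for loss = (w2 * tanh(w1 * x) - t)^2, checked against autograd.
x, t = 0.5, 1.0
w1 = torch.tensor(0.8, requires_grad=True)
w2 = torch.tensor(-0.4, requires_grad=True)

h = torch.tanh(w1 * x)
y = w2 * h
loss = (y - t) ** 2
loss.backward()

# Chain rule by hand, layer by layer:
dL_dy = 2 * (y - t)
dL_dw2 = dL_dy * h
dL_dh = dL_dy * w2
dL_dw1 = dL_dh * (1 - h ** 2) * x   # tanh'(z) = 1 - tanh(z)^2

print(torch.allclose(dL_dw1, w1.grad), torch.allclose(dL_dw2, w2.grad))  # True True
```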

VIDEO 7. Learning Rate Decay

- Why should you gradually reduce your learning rate?

6 minutes
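One common decay scheme is step decay, where the learning rate is multiplied by a constant every few epochs. A sketch using PyTorch's scheduler API (my own example; the video may present a different schedule):

```python
import torch

param = torch.zeros(1, requires_grad=True)
opt = torch.optim.SGD([param], lr=0.1)
# Halve the learning rate every 10 steps.
sched = torch.optim.lr_scheduler.StepLR(opt, step_size=10, gamma=0.5)

lrs = []
for _ in range(30):
    opt.step()        # (gradients omitted; we only track the lr schedule)
    sched.step()
    lrs.append(opt.param_groups[0]["lr"])

print(lrs[0], lrs[10], lrs[20])  # 0.1 0.05 0.025
```

Large early steps make fast progress; smaller late steps stop the parameters from bouncing around the minimum.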

VIDEO 8. Weights Initialization

- Why initialize parameters for a DNN?

6 minutes
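In practice, PyTorch exposes the standard initialization schemes through `torch.nn.init`. A minimal sketch (my own; the video may recommend a specific scheme):

```python
import torch.nn as nn

layer = nn.Linear(256, 128)
# Common schemes: Xavier/Glorot for tanh/sigmoid, Kaiming/He for ReLU.
nn.init.xavier_uniform_(layer.weight)
nn.init.zeros_(layer.bias)

# Xavier uniform samples from [-bound, bound] with
# bound = sqrt(6 / (fan_in + fan_out)), keeping activation variance
# roughly constant from layer to layer.
print(layer.weight.abs().max().item())
```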
