Covers: theory of Optimization

- How else can I train my neural nets?

In this video you will learn about a few more optimization techniques, including the frequently used backpropagation algorithm.

This method allows us to trace the error backwards through the network, so we can correct the neurons that are over- or underfitting the data points. To understand it we first need some calculus concepts, such as the chain rule.

The chain rule tells us how to find the derivative of a composite function. You can read more in this article.
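As a quick illustration of the chain rule (this snippet is ours, not from the video), here is a hypothetical composite function f(g(x)) with f(u) = u² and g(x) = 3x + 1, checking the analytic derivative f′(g(x))·g′(x) against a numerical estimate:

```python
# Chain rule check: d/dx f(g(x)) = f'(g(x)) * g'(x)
# f and g are illustrative choices, not from the course material.
def g(x):
    return 3 * x + 1          # g'(x) = 3

def f(u):
    return u ** 2             # f'(u) = 2u

def analytic_derivative(x):
    # f'(g(x)) * g'(x) = 2 * (3x + 1) * 3
    return 2 * g(x) * 3

def numerical_derivative(x, h=1e-6):
    # Central finite difference of the composite function
    return (f(g(x + h)) - f(g(x - h))) / (2 * h)

x = 2.0
print(analytic_derivative(x))   # 42.0
```

The numerical estimate agrees with the analytic value to within floating-point error, which is exactly the identity backpropagation applies layer by layer.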

If the video fails to play, open the link directly: https://youtu.be/zMWNOvS1ySs

Contributors: Amir Hajian

Who is this for? INTERMEDIATE


Resources

VIDEO 1. Loss Functions

- What is the relevance of loss functions for deep learning?

17 minutes
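To make concrete what a loss function measures (this example is ours, not from the video), here is mean squared error, a common regression loss, on a pair of illustrative predictions:

```python
# Mean squared error: average of squared prediction errors.
# The predictions and targets below are made-up illustrative values.
def mse(preds, targets):
    return sum((p - t) ** 2 for p, t in zip(preds, targets)) / len(preds)

print(mse([2.0, 4.0], [1.0, 5.0]))  # 1.0
```

A lower value means the predictions are closer to the targets; training minimizes this number.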

VIDEO 2. Optimization using Gradient Descent - Part 1

- How do neural networks learn?

26 minutes

VIDEO 3. Optimization using Gradient Descent - Part 2

- How do you use gradient descent for parameter updating?

17 minutes

VIDEO 4. Chain Rule, Backpropagation & Autograd

- How else can I train my neural nets?

21 minutes

REPO 5. Hands-on Optimization

- How to implement optimization methods in PyTorch?
- How can we update parameters with Gradient Descent?
- How to implement gradient descent in Pytorch?

30 minutes
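The gradient-descent update asked about above can be sketched in a few lines. This is a minimal plain-Python sketch (the repo itself uses PyTorch; the loss, learning rate, and step count here are assumed illustrative values):

```python
# Gradient descent on the toy loss L(w) = (w - 3)^2, minimized at w = 3.
# Loss, learning rate, and step count are illustrative, not from the repo.
def grad(w):
    return 2 * (w - 3)        # dL/dw

w = 0.0                       # initial parameter guess
lr = 0.1                      # learning rate (assumed value)
for _ in range(100):
    w -= lr * grad(w)         # update rule: w <- w - lr * dL/dw

print(round(w, 4))            # converges toward 3.0
```

In PyTorch the same loop would compute `grad` via autograd and apply the update through an optimizer, but the arithmetic of the step is exactly this.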

RECIPE 6. Understanding Backpropagation

4 hours
