Covers: theory of Optimization
Estimated time needed to finish: 21 minutes
Questions this item addresses:
  • How else can I train my neural nets?
How to use this item?

In this video, you will learn about a few more optimization techniques, including the frequently used backpropagation algorithm.

Backpropagation

This method allows us to trace the error backward through the network, so we can correct the neurons that are over- or under-fitting the data points. To understand how it works, we first need a calculus concept: the chain rule.
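Before getting into the math, here is a minimal sketch of where we are headed, using PyTorch's autograd (the tensor shapes, random data, and 0.1 learning rate below are illustrative assumptions, not taken from the video):

    import torch

    # Tiny model: one linear layer trained with mean squared error.
    x = torch.randn(4, 3)                      # 4 samples, 3 features (assumed shapes)
    y = torch.randn(4, 1)                      # targets
    w = torch.randn(3, 1, requires_grad=True)  # weights to learn
    b = torch.zeros(1, requires_grad=True)     # bias to learn

    y_hat = x @ w + b                  # forward pass
    loss = ((y_hat - y) ** 2).mean()   # mean squared error

    loss.backward()  # backpropagation: autograd traces the error back to w and b

    with torch.no_grad():  # gradient-descent update, as in the earlier videos
        w -= 0.1 * w.grad  # 0.1 is an assumed learning rate
        b -= 0.1 * b.grad

A single call to loss.backward() fills in w.grad and b.grad; the chain rule below is what autograd applies under the hood to compute them.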

Chain Rule

The chain rule tells us how to find the derivative of a composite function. You can read more in this article.
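In symbols (standard calculus, not specific to the video): for a composite function h(x) = f(g(x)),

    \frac{dh}{dx} = \frac{df}{dg} \cdot \frac{dg}{dx} = f'(g(x)) \, g'(x)

For example, with f(u) = u^2 and g(x) = \sin x, the derivative of h(x) = \sin^2 x is h'(x) = 2 \sin x \cos x. A neural network's loss is a deep composition of functions like this, and backpropagation is the chain rule applied layer by layer.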

If the video fails to play, open the link directly: https://youtu.be/zMWNOvS1ySs
Author(s) / creator(s) / reference(s)
Amir Hajian

Deep Learning Model Training And Optimization

Total time needed: ~6 hours
Who is This For?
INTERMEDIATE
Resources
VIDEO 1. Loss Functions
  • What is the relevance of loss functions for deep learning?
17 minutes
VIDEO 2. Optimization using Gradient Descent - Part 1
  • How do neural networks learn?
26 minutes
VIDEO 3. Optimization using Gradient Descent - Part 2
  • How do you use gradient descent for parameter updating?
17 minutes
VIDEO 4. Chain Rule, Backpropagation & Autograd
  • How else can I train my neural nets?
21 minutes
REPO 5. Hands-on Optimization
  • How to implement optimization methods in PyTorch?
  • How can we update parameters with Gradient Descent?
  • How to implement gradient descent in PyTorch?
30 minutes
RECIPE 6. Understanding Backpropagation
4 hours
