Covers: implementation of optimization
Estimated time needed to finish: 17 minutes
Questions this item addresses:
  • How do you use gradient descent for parameter updating?
How to use this item?

In this video you will implement gradient descent, become more familiar with the concept of derivatives, and walk through a training example.
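The parameter-update loop the video walks through can be sketched in plain Python. The quadratic loss f(w) = (w - 3)^2, its derivative, and the hyperparameters below are illustrative assumptions, not taken from the video:

```python
# Minimal gradient descent sketch: minimize f(w) = (w - 3)^2.
# The loss, learning rate, and step count are illustrative choices.

def loss(w):
    return (w - 3.0) ** 2

def grad(w):
    # Analytic derivative of the loss: d/dw (w - 3)^2 = 2(w - 3)
    return 2.0 * (w - 3.0)

def gradient_descent(w0, lr=0.1, steps=100):
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)  # parameter update: step against the gradient
    return w

w_final = gradient_descent(w0=0.0)
```

Starting from w = 0, the updates shrink the distance to the minimizer by a constant factor each step, so w_final converges to 3, the minimum of the loss.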

If the video fails to play, open the link directly: https://youtu.be/x7F7zZd23PU
Author(s) / creator(s) / reference(s)
Amir Hajian
Ammar Khan
Recipe

Deep Learning Model Training And Optimization

Contributors
Total time needed: ~6 hours
Objectives
With this recipe, you will understand how to train a neural network.
Potential Use Cases
Build your own Neural Network
Who is this for?
BEGINNER
Resources
VIDEO 1. Loss Functions
  • What is the relevance of loss functions for deep learning?
17 minutes
VIDEO 2. Optimization using Gradient Descent - Part 1
  • How do neural networks learn?
26 minutes
VIDEO 3. Optimization using Gradient Descent - Part 2
  • How do you use gradient descent for parameter updating?
17 minutes
VIDEO 4. Chain Rule, Backpropagation & Autograd
  • How else can I train my neural nets?
21 minutes
REPO 5. Hands-on Optimization
  • How to implement optimization methods in PyTorch?
  • How can we update parameters with Gradient Descent?
  • How to implement gradient descent in PyTorch?
30 minutes
RECIPE 6. Understanding Backpropagation
4 hours
VIDEO 7. Learning Rate Decay
  • Why should you gradually reduce your learning rate?
6 minutes
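One common way to reduce the learning rate over training is inverse-time decay; this is a sketch of one possible schedule (the initial rate and decay constant are illustrative, and the video may cover a different scheme):

```python
# Inverse-time learning rate decay sketch.
# lr0 and decay_rate are illustrative assumptions, not from the video.

def decayed_lr(lr0, decay_rate, epoch):
    # The learning rate shrinks smoothly as training progresses,
    # allowing large early steps and fine late adjustments.
    return lr0 / (1.0 + decay_rate * epoch)

lrs = [decayed_lr(0.1, 0.5, e) for e in range(5)]
```

Here lrs starts at 0.1 and decreases monotonically, so later epochs take progressively smaller steps.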
VIDEO 8. Weights Initialization
  • Why should you initialize parameters for a DNN?
6 minutes
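One widely used initialization scheme is Xavier/Glorot uniform initialization, which draws weights from U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out)). A plain-Python sketch follows; the layer sizes are illustrative, and the video may present a different scheme:

```python
import math
import random

def xavier_uniform(fan_in, fan_out, seed=None):
    # Xavier/Glorot uniform initialization: the bound depends on both
    # layer widths, which helps keep activation variance roughly
    # constant across layers. Layer sizes here are illustrative.
    rng = random.Random(seed)
    limit = math.sqrt(6.0 / (fan_in + fan_out))
    return [[rng.uniform(-limit, limit) for _ in range(fan_out)]
            for _ in range(fan_in)]

W = xavier_uniform(256, 128, seed=0)
```

Every entry of W lies within the Xavier bound, in contrast to initializing all weights to zero, which would make every neuron in a layer compute the same gradient.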

Concepts Covered
