Covers: implementation of Optimization
Estimated time needed to finish: 30 minutes
Questions this item addresses:
  • How to implement optimization methods in PyTorch?
  • How can we update parameters with Gradient Descent?
  • How to implement gradient descent in PyTorch?
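The core idea behind these questions can be sketched in a few lines: compute a loss, let autograd compute its gradient, and step the parameter against the gradient. This is a minimal illustration, not the notebooks' actual code — the data, initial guess, and learning rate are made-up assumptions:

```python
import torch

# Hypothetical 1-D regression data: y = 2x plus a little noise.
torch.manual_seed(0)
X = torch.arange(-3.0, 3.0, 0.1)
Y = 2.0 * X + 0.2 * torch.randn(X.size())

w = torch.tensor(-10.0, requires_grad=True)  # deliberately bad initial guess
lr = 0.1                                     # illustrative learning rate

for epoch in range(20):
    loss = torch.mean((w * X - Y) ** 2)  # mean squared error
    loss.backward()                      # autograd computes d(loss)/dw
    with torch.no_grad():
        w -= lr * w.grad                 # gradient descent update
    w.grad.zero_()                       # reset the accumulated gradient

print(w.item())  # close to the true slope, 2.0
```

Note the two PyTorch idioms the notebooks exercise: updating parameters inside `torch.no_grad()` so the update itself is not tracked, and zeroing gradients after each step because `backward()` accumulates them.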
How to use this item?

This sub-repo contains two notebooks you can work through to practice these concepts hands-on in Python. You can also check the answers directly in the Solutions file. The recommended way to use this item is:

1.- Take your time to solve the notebooks by yourself.
2.- Compare each notebook with its counterpart in the ./Solutions file.

Bonus!

There are questions inside Quiz.md that let you evaluate yourself and explore more topics related to the recipe's content.

In this notebook, you will learn about:

  • Gradient Descent Algorithm
  • Derivatives, Chain Rule, Backpropagation
  • Autograd
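These three topics fit together: autograd applies the chain rule automatically, which is exactly what backpropagation does through a network. A minimal sketch of that connection, using a made-up composite function rather than anything from the notebooks:

```python
import torch

# Composite function y(u(x)) with u = x**2 + 1 and y = u**3.
x = torch.tensor(2.0, requires_grad=True)
u = x ** 2 + 1   # inner function: u(2) = 5
y = u ** 3       # outer function: y = 125

# Backpropagation: autograd applies the chain rule
#   dy/dx = (dy/du) * (du/dx) = 3u**2 * 2x = 3 * 25 * 4 = 300
y.backward()

print(x.grad.item())  # 300.0, matching the hand-derived chain rule
```

The same mechanism, applied layer by layer through a network's computation graph, is what trains the models in the videos above.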
Author(s) / creator(s) / reference(s)
Amir Hajian
Programming Languages: Python

Deep Learning Model Training And Optimization

Contributors
Total time needed: ~6 hours
Who is this for?
INTERMEDIATE
Resources
VIDEO 1. Loss Functions
  • What is the relevance of loss functions for deep learning?
17 minutes
VIDEO 2. Optimization using Gradient Descent - Part 1
  • How do neural networks learn?
26 minutes
VIDEO 3. Optimization using Gradient Descent - Part 2
  • How do you use gradient descent for parameter updating?
17 minutes
VIDEO 4. Chain Rule, Backpropagation & Autograd
  • How else can I train my neural nets?
21 minutes
REPO 5. Hands-on Optimization
  • How to implement optimization methods in PyTorch?
  • How can we update parameters with Gradient Descent?
  • How to implement gradient descent in PyTorch?
30 minutes
RECIPE 6. Understanding BackPropagation
4 hours
