Covers: implementing optimization methods in PyTorch

- How do you implement optimization methods in PyTorch?
- How do you update parameters with gradient descent?
- How do you implement gradient descent in PyTorch?
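As a preview of what the notebooks cover, here is a minimal sketch of a hand-written gradient descent update in PyTorch. The toy model `y = w * x` with true weight 2.0, and the learning rate, are illustrative assumptions, not taken from the notebooks:

```python
import torch

# Hypothetical toy data for y = w * x with true w = 2.0
x = torch.tensor([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x

w = torch.tensor(0.0, requires_grad=True)
lr = 0.05  # assumed learning rate

for _ in range(200):
    loss = ((w * x - y) ** 2).mean()  # mean squared error
    loss.backward()                   # autograd computes d(loss)/dw
    with torch.no_grad():
        w -= lr * w.grad              # gradient descent update
    w.grad.zero_()                    # reset gradient for the next step

print(round(w.item(), 3))  # converges toward 2.0
```

The `torch.no_grad()` context matters: the update itself should not be recorded in the computation graph, only the forward pass should.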

This sub-repo has two notebooks you can work through to practice the concepts hands-on in Python. You can also see the answers directly in the Solutions file. The recommended workflow is:

1. Take your time to solve the notebooks by yourself.
2. Compare your notebooks with their counterparts in ./Solutions.

There are questions inside Quiz.md where you can evaluate yourself and explore more topics related to the recipe's content.

In this notebook, you will learn about:

- Gradient Descent Algorithm
- Derivatives, Chain Rule, Backpropagation
- Autograd
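The last two topics are closely linked: autograd applies the chain rule for you during backpropagation. A minimal sketch (the function and values are illustrative assumptions):

```python
import torch

# For y = (3x + 2)^2, the chain rule gives dy/dx = 2*(3x + 2)*3
x = torch.tensor(1.0, requires_grad=True)
u = 3 * x + 2   # inner function
y = u ** 2      # outer function
y.backward()    # backpropagation: autograd chains the derivatives

print(x.grad.item())  # 2 * (3*1 + 2) * 3 = 30.0
```

Every operation on a tensor with `requires_grad=True` is recorded in a computation graph, and `backward()` walks that graph in reverse to accumulate gradients.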

Amir Hajian



- Who is this for? Intermediate

The following **annotated items** describe each resource in this recipe.

Resources

VIDEO 1. Loss Functions

- What is the relevance of loss functions for deep learning?

17 minutes

VIDEO 2. Optimization using Gradient Descent - Part 1

- How do neural networks learn?

26 minutes

VIDEO 3. Optimization using Gradient Descent - Part 2

- How do you use gradient descent for parameter updating?

17 minutes

VIDEO 4. Chain Rule, Backpropagation & Autograd

- How else can I train my neural nets?

21 minutes

REPO 5. Hands-on Optimization

- How do you implement optimization methods in PyTorch?
- How do you update parameters with gradient descent?
- How do you implement gradient descent in PyTorch?

30 minutes
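In practice, the hands-on repo's questions are usually answered with PyTorch's built-in optimizers rather than a hand-written loop. A sketch using `torch.optim.SGD` on the same kind of toy linear fit (the data and learning rate are illustrative assumptions):

```python
import torch

# Hypothetical toy data for y = w * x with true w = 2.0
x = torch.tensor([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x

w = torch.tensor(0.0, requires_grad=True)
opt = torch.optim.SGD([w], lr=0.05)

for _ in range(200):
    opt.zero_grad()                   # clear old gradients
    loss = ((w * x - y) ** 2).mean()  # mean squared error
    loss.backward()                   # backpropagate
    opt.step()                        # applies w -= lr * w.grad internally

print(round(w.item(), 3))  # converges toward 2.0
```

Swapping `SGD` for another optimizer (e.g. `torch.optim.Adam`) changes only the constructor line; the `zero_grad` / `backward` / `step` pattern stays the same.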

RECIPE 6. Understanding Backpropagation

4 hours
