Covers: theory of loss functions
Estimated time needed to finish: 17 minutes
Questions this item addresses:
  • What is the relevance of loss functions for deep learning?
How to use this item?

In this video, you will learn about:

  • Non-linear Function: Softmax

Some non-linear functions: softmax
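As a quick illustration of the softmax function mentioned above, here is a minimal plain-Python sketch (the function name and example logits are my own; the max-subtraction trick is the standard way to keep the exponentials numerically stable):

```python
import math

def softmax(logits):
    # Subtract the max logit before exponentiating, for numerical stability;
    # this does not change the result because softmax is shift-invariant.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    # Each output is a probability; the outputs sum to 1.
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
print([round(p, 3) for p in probs])  # the largest logit gets the largest probability
```

Note how the outputs form a valid probability distribution, which is why softmax is typically the last layer before a cross-entropy loss in classification networks.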

  • MSE (mean squared error) loss function
  • Cross Entropy loss function (log loss)

Loss functions define the problem that the neural network will solve, so it is in your interest to understand how they work and why a particular loss function is chosen for a specific network.
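The two loss functions covered in the video can be sketched in a few lines of plain Python (the function names and example values here are my own, not from the video; real training code would use library implementations such as PyTorch's built-in losses):

```python
import math

def mse(targets, predictions):
    # Mean squared error: average of squared differences,
    # typically used for regression problems.
    return sum((t - p) ** 2 for t, p in zip(targets, predictions)) / len(targets)

def cross_entropy(target_probs, predicted_probs):
    # Cross entropy (log loss): negative log-likelihood of the target
    # distribution under the predicted distribution, typically used for
    # classification with softmax outputs.
    return -sum(t * math.log(p) for t, p in zip(target_probs, predicted_probs))

print(mse([1.0, 0.0], [0.8, 0.2]))            # small prediction error -> small loss
print(cross_entropy([1.0, 0.0], [0.9, 0.1]))  # confident correct prediction -> small loss
```

Either function maps the network's outputs and the targets to a single scalar, and it is that scalar the optimizer (covered in the gradient-descent videos below) tries to minimize.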

Video link: https://youtu.be/p_0nDtAFy60
Author(s) / creator(s) / reference(s)
Amir Hajian

Deep Learning Model Training And Optimization

Contributors
Total time needed: ~6 hours
Who is this for?
INTERMEDIATE
Click on each of the following annotated items to see details.
Resources (5/6)
VIDEO 1. Loss Functions
  • What is the relevance of loss functions for deep learning?
17 minutes
VIDEO 2. Optimization using Gradient Descent - Part 1
  • How do neural networks learn?
26 minutes
VIDEO 3. Optimization using Gradient Descent - Part 2
  • How do you use gradient descent for parameter updating?
17 minutes
VIDEO 4. Chain Rule, Backpropagation & Autograd
  • How else can I train my neural nets?
21 minutes
REPO 5. Hands-on Optimization
  • How to implement optimization methods in PyTorch?
  • How can we update parameters with Gradient Descent?
  • How to implement gradient descent in PyTorch?
30 minutes
RECIPE 6. Understanding Backpropagation
4 hours

Concepts Covered
