Covers: theory of optimization


Read sections 1 through 4

Contributor: Hardik Sahi

- Objectives
- Learn the fundamental structures of deep learning and the mathematical functions used to implement it.
- Potential Use Cases
- If you're a beginner looking to figure out where to begin, or a trained veteran hoping to review some basics.
- Who Is This For?
- BEGINNER


Resources

OTHER 1. Hidden layers

- What are the layers in a neural network and what do they do?

50 minutes
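A hidden layer is simply a linear map followed by a nonlinearity, sitting between the input and the output. As a rough sketch (layer sizes and the ReLU choice here are assumptions for illustration, not from the resource):

```python
import numpy as np

# Forward pass through a tiny network with one hidden layer.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))         # batch of 4 inputs, 3 features each
W1 = rng.normal(size=(3, 5))        # input -> hidden (5 hidden units)
b1 = np.zeros(5)
W2 = rng.normal(size=(5, 2))        # hidden -> output (2 outputs)
b2 = np.zeros(2)

hidden = np.maximum(0, x @ W1 + b1)  # hidden layer: linear map + ReLU
output = hidden @ W2 + b2            # output layer: linear map only
print(output.shape)                  # (4, 2)
```

Each hidden layer transforms its input into a new representation; stacking them is what makes the network "deep".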

OTHER 2. Activation functions

- What is an activation function and why is it important in the context of NNs?

20 minutes
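Without an activation function, stacked linear layers collapse into a single linear map, so the nonlinearity is what makes depth useful. Two common choices, sketched for illustration:

```python
import numpy as np

def sigmoid(z):
    # Squashes any real input into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Passes positive inputs through, zeroes out negative ones.
    return np.maximum(0.0, z)

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z))
print(relu(z))
```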

ARTICLE 3. Backpropagation

- What is the most common technique for propagating errors backward through a neural network, and how do we use it to update the network's weights?

60 minutes
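Backpropagation is the chain rule applied layer by layer, from the loss back to each weight. A minimal sketch for a one-hidden-layer network with squared-error loss (the sizes and ReLU choice are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=(3,))
y = np.array([1.0])
W1 = rng.normal(size=(3, 4))
W2 = rng.normal(size=(4, 1))

# Forward pass
h = np.maximum(0, x @ W1)            # hidden layer with ReLU
pred = h @ W2
loss = 0.5 * np.sum((pred - y) ** 2)

# Backward pass: chain rule, one layer at a time
dpred = pred - y                     # dL/dpred
dW2 = np.outer(h, dpred)             # dL/dW2
dh = W2 @ dpred                      # dL/dh, pushed back through W2
dh[h <= 0] = 0                       # ReLU passes gradient only where h > 0
dW1 = np.outer(x, dh)                # dL/dW1
```

The gradients `dW1` and `dW2` are then handed to an optimizer (e.g. gradient descent) to update the weights.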

OTHER 4. Gradient Descent

- What is the most common optimization technique?

30 minutes
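Gradient descent repeatedly steps against the gradient of the loss. A minimal sketch on a one-dimensional function (the learning rate and step count are arbitrary illustrative choices):

```python
# Minimize f(x) = (x - 3)^2, whose minimum is at x = 3.
def grad(x):
    return 2.0 * (x - 3.0)   # derivative of f

x = 0.0
lr = 0.1                     # learning rate
for _ in range(100):
    x -= lr * grad(x)        # step in the direction of steepest descent
print(round(x, 4))           # → 3.0
```

The same update rule, `w -= lr * grad`, applied with the gradients from backpropagation, is how a neural network's weights are trained.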

ARTICLE 5. L1 Regularization

- How do we prevent overfitting by penalizing the absolute values of our weights?

10 minutes
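L1 regularization adds a penalty proportional to the absolute values of the weights; its characteristic effect is driving small weights exactly to zero. A sketch using the soft-thresholding (proximal) update associated with the L1 penalty (`lam` is an assumed hyperparameter):

```python
import numpy as np

def l1_penalty(w, lam=0.1):
    # lam * sum(|w_i|), added to the training loss
    return lam * np.sum(np.abs(w))

def soft_threshold(w, lam=0.1):
    # Proximal update for the L1 penalty: shrink each weight
    # toward zero and clip at zero, which produces sparsity.
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

w = np.array([0.05, -0.3, 0.8])
print(soft_threshold(w))   # the smallest weight becomes exactly 0
```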

ARTICLE 6. L2 Regularization

- How do we prevent overfitting by penalizing the squared values of our weights?

10 minutes
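L2 regularization instead penalizes the squared values of the weights; its gradient, `lam * w`, shrinks every weight multiplicatively each step, which is why it is often called weight decay. A sketch with assumed values for `lam` and the learning rate:

```python
import numpy as np

def l2_penalty(w, lam=0.1):
    # 0.5 * lam * sum(w_i^2), added to the training loss
    return 0.5 * lam * np.sum(w ** 2)

w = np.array([1.0, -2.0])
lr, lam = 0.1, 0.1
# One gradient step on the penalty alone: w <- w - lr * lam * w
w_new = w - lr * (lam * w)
print(w_new)   # every weight shrunk by the factor (1 - lr * lam)
```

Unlike L1, L2 shrinks weights smoothly toward zero but rarely makes them exactly zero.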
