Covers: theory of the Gated Recurrent Unit (GRU)

- What is a GRU, and what is the math behind this model?


Collaborators: Andrew Ng

- Objectives: Gated recurrent units (GRUs) are a gating mechanism in recurrent neural networks, introduced in 2014 by Kyunghyun Cho et al.
- Potential use cases: The GRU aims to mitigate the vanishing gradient problem that affects standard recurrent neural networks.
- Who is this for? INTERMEDIATE

Click on each of the following **annotated items** to see details.

VIDEO 1. Introduction to RNN

- What is an RNN?

10 minutes

LIBRARY 2. RNN from scratch

- How to implement an RNN in Python from scratch?

20 minutes
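The item above walks through an RNN implementation; a minimal sketch of the core recurrence in NumPy (layer sizes and weight names here are illustrative, not taken from the linked library) looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the linked material)
input_size, hidden_size = 3, 4

# Input-to-hidden weights, hidden-to-hidden weights, and a bias
W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1
b_h = np.zeros(hidden_size)

def rnn_step(x, h_prev):
    """One vanilla-RNN step: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b)."""
    return np.tanh(W_xh @ x + W_hh @ h_prev + b_h)

# Unroll the same step over a short input sequence
h = np.zeros(hidden_size)
for x in rng.standard_normal((5, input_size)):
    h = rnn_step(x, h)
```

The key point the course item develops is that the same weights `W_xh` and `W_hh` are reused at every time step.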

ARTICLE 3. Recurrent Neural Networks (RNN) - The Vanishing Gradient Problem

- What is the vanishing gradient problem in the RNN architecture?

20 minutes
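The problem the article describes can be illustrated numerically: backpropagation through time multiplies one derivative factor per time step, and since tanh's derivative is at most 1 (and the recurrent weight is often small), the product shrinks toward zero. The specific numbers below are illustrative only:

```python
import numpy as np

def tanh_grad(z):
    """tanh'(z) = 1 - tanh(z)^2, which is at most 1 and often much smaller."""
    return 1.0 - np.tanh(z) ** 2

# Multiply per-step derivative factors across a long sequence,
# as backpropagation through time would; 0.5 stands in for a
# recurrent weight and 0.8 for a typical pre-activation value.
grad = 1.0
for t in range(50):
    grad *= 0.5 * tanh_grad(0.8)

print(grad)  # a vanishingly small number
```

After 50 steps the surviving gradient signal is effectively zero, which is why the early time steps of a long sequence stop learning in a standard RNN.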

VIDEO 4. GRU model

- What is a GRU, and what is the math behind this model?

20 minutes
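As a quick reference for the math covered in this video, the GRU equations in the original Cho et al. (2014) formulation are below (note that some presentations swap the roles of $z_t$ and $1 - z_t$; biases are omitted for brevity):

```latex
\begin{aligned}
z_t &= \sigma\!\left(W_z x_t + U_z h_{t-1}\right) && \text{(update gate)} \\
r_t &= \sigma\!\left(W_r x_t + U_r h_{t-1}\right) && \text{(reset gate)} \\
\tilde{h}_t &= \tanh\!\left(W x_t + U\,(r_t \odot h_{t-1})\right) && \text{(candidate state)} \\
h_t &= z_t \odot h_{t-1} + (1 - z_t) \odot \tilde{h}_t && \text{(new hidden state)}
\end{aligned}
```

When the update gate $z_t$ is close to 1, the old state $h_{t-1}$ is carried through almost unchanged, which is how the GRU keeps gradients from vanishing over long sequences.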

ARTICLE 5. GRU - More theory on this model

- What is the GRU model?

20 minutes

LIBRARY 6. Implementing GRU in python

- How to implement a GRU in Python?

10 minutes
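A minimal NumPy sketch of a single GRU step, following the Cho et al. (2014) gate equations (the sizes and weight names are illustrative assumptions, not the linked library's API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes; names follow the usual GRU notation
input_size, hidden_size = 3, 4

def init(shape):
    return rng.standard_normal(shape) * 0.1

# Update gate z, reset gate r, and candidate-state parameters
W_z, U_z = init((hidden_size, input_size)), init((hidden_size, hidden_size))
W_r, U_r = init((hidden_size, input_size)), init((hidden_size, hidden_size))
W_h, U_h = init((hidden_size, input_size)), init((hidden_size, hidden_size))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev):
    """One GRU step following the Cho et al. (2014) equations."""
    z = sigmoid(W_z @ x + U_z @ h_prev)               # update gate
    r = sigmoid(W_r @ x + U_r @ h_prev)               # reset gate
    h_tilde = np.tanh(W_h @ x + U_h @ (r * h_prev))   # candidate state
    return z * h_prev + (1.0 - z) * h_tilde           # blend old and new

# Run the cell over a short input sequence
h = np.zeros(hidden_size)
for x in rng.standard_normal((5, input_size)):
    h = gru_step(x, h)
```

Compared with the vanilla RNN step, the only additions are the two gates; the update gate's interpolation between `h_prev` and `h_tilde` is what lets gradients flow across many time steps.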

PAPER 7. Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation (Optional: original paper)

- Where does this model come from?

20 minutes
