Covers: theory of the GRU (Gated Recurrent Unit)
Estimated time needed to finish: 20 minutes
Questions this item addresses:
  • What is a GRU, and what is the math behind this model?
How to use this item: watch the whole video.

Author(s): Andrew Ng

Gated Recurrent Unit (GRU)

Total time needed: ~2 hours
Objectives
Gated recurrent units (GRUs) are a gating mechanism in recurrent neural networks, introduced in 2014 by Kyunghyun Cho et al.
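For reference, here is a common formulation of the GRU equations (a sketch; notation varies across references, and the original paper writes the final interpolation with the roles of z_t and 1 - z_t swapped):

\[
\begin{aligned}
z_t &= \sigma(W_z x_t + U_z h_{t-1} + b_z) && \text{(update gate)} \\
r_t &= \sigma(W_r x_t + U_r h_{t-1} + b_r) && \text{(reset gate)} \\
\tilde{h}_t &= \tanh\big(W_h x_t + U_h (r_t \odot h_{t-1}) + b_h\big) && \text{(candidate state)} \\
h_t &= (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t && \text{(new hidden state)}
\end{aligned}
\]

where \(\sigma\) is the logistic sigmoid and \(\odot\) denotes element-wise multiplication.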
Potential Use Cases
GRU (Gated Recurrent Unit) aims to mitigate the vanishing gradient problem that affects standard recurrent neural networks.
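A one-line sketch of why the gating helps (using the convention above, and ignoring the gate and candidate terms for intuition rather than as a full derivation):

\[
\frac{\partial h_t}{\partial h_{t-1}} \approx \operatorname{diag}(1 - z_t) + \text{(gate and candidate terms)},
\]

so when the update gate \(z_t\) is close to 0 the cell copies its state almost unchanged, giving gradients a near-identity path across many time steps instead of being repeatedly squashed by \(\tanh\) derivatives as in a plain RNN.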
Who Is This For?
INTERMEDIATE
Resources
VIDEO 1. Introduction to RNN
  • What is an RNN?
10 minutes
LIBRARY 2. RNN from scratch
  • How to implement an RNN in Python from scratch?
20 minutes
ARTICLE 3. Recurrent Neural Networks (RNN) - The Vanishing Gradient Problem
  • What is the Vanishing Gradient Problem in the RNN architecture?
20 minutes
VIDEO 4. GRU model
  • What is a GRU, and what is the math behind this model?
20 minutes
ARTICLE 5. GRU - More theory on this model
  • What is the GRU model?
20 minutes
LIBRARY 6. Implementing GRU in Python
  • How to implement a GRU in Python? (A minimal sketch follows this list.)
10 minutes
PAPER 7. Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation (Optional: the original paper)
  • Where does this model come from?
20 minutes
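
As a companion to item 6, here is a minimal NumPy sketch of a single GRU cell implementing the equations given under Objectives. All names (GRUCell, step, W_z, U_z, and so on) are illustrative choices, not taken from any of the listed resources:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell; weight names mirror the equations above."""

    def __init__(self, input_size, hidden_size, rng=None):
        if rng is None:
            rng = np.random.default_rng(0)
        scale = 1.0 / np.sqrt(hidden_size)

        def init(rows, cols):
            return rng.uniform(-scale, scale, size=(rows, cols))

        # One (W, U, b) triple per gate, plus one for the candidate state.
        self.W_z, self.U_z, self.b_z = init(hidden_size, input_size), init(hidden_size, hidden_size), np.zeros(hidden_size)
        self.W_r, self.U_r, self.b_r = init(hidden_size, input_size), init(hidden_size, hidden_size), np.zeros(hidden_size)
        self.W_h, self.U_h, self.b_h = init(hidden_size, input_size), init(hidden_size, hidden_size), np.zeros(hidden_size)

    def step(self, x_t, h_prev):
        z = sigmoid(self.W_z @ x_t + self.U_z @ h_prev + self.b_z)              # update gate
        r = sigmoid(self.W_r @ x_t + self.U_r @ h_prev + self.b_r)              # reset gate
        h_tilde = np.tanh(self.W_h @ x_t + self.U_h @ (r * h_prev) + self.b_h)  # candidate state
        return (1.0 - z) * h_prev + z * h_tilde                                 # interpolate old and new

# Usage: run a short random sequence through the cell.
cell = GRUCell(input_size=8, hidden_size=16)
h = np.zeros(16)
for x_t in np.random.default_rng(1).normal(size=(5, 8)):
    h = cell.step(x_t, h)
print(h.shape)  # (16,)

Note the final line of step: when z is near 0 the previous state passes through almost unchanged, which is the mechanism behind the vanishing-gradient mitigation described under Potential Use Cases.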

Concepts Covered

Recurrent neural networks, the vanishing gradient problem, and the GRU gating mechanism.