Covers: implementation of GRU (Gated Recurrent Unit)
Questions this item addresses:
  • How to implement GRU in Python?
How to use this item?

This article explains code for GRU, LSTM, and RNN models. Feel free to explore all of them.

Author(s) / creator(s) / reference(s)
Chandra Charh
Programming Languages: Python

Gated Recurrent Unit (GRU)

Total time needed: ~2 hours
Gated recurrent units (GRUs) are a gating mechanism in recurrent neural networks, introduced in 2014 by Kyunghyun Cho et al.
Potential Use Cases
GRU (Gated Recurrent Unit) aims to solve the vanishing gradient problem which comes with a standard recurrent neural network.
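As a quick toy sketch of the problem GRU addresses (illustrative code, not from any of the items below): backpropagating through a plain tanh RNN multiplies the gradient by the step Jacobian at every time step, so the gradient norm can shrink exponentially with sequence length.

```python
import numpy as np

# Toy demonstration of the vanishing gradient in a plain RNN.
# Weights and sizes are arbitrary choices for illustration.
np.random.seed(0)
W_hh = np.random.randn(8, 8) * 0.1   # small recurrent weights (assumed)
h = np.zeros(8)
grad = np.eye(8)                      # d h_T / d h_T at the last step
norms = []
for t in range(30):
    pre = W_hh @ h + np.random.randn(8) * 0.1
    h = np.tanh(pre)
    # Jacobian of one recurrent step: diag(1 - tanh(pre)^2) @ W_hh
    J = np.diag(1 - np.tanh(pre) ** 2) @ W_hh
    grad = J @ grad                   # accumulate across time steps
    norms.append(np.linalg.norm(grad))

print(norms[0], norms[-1])            # the gradient norm collapses toward zero
```

GRU counters this with an additive, gated state update rather than a full matrix multiply at every step.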
Who is This For?
Click on each of the following annotated items to see details.
VIDEO 1. Introduction to RNN
  • What is RNN?
10 minutes
LIBRARY 2. RNN from scratch
  • How to implement RNN in Python from scratch?
20 minutes
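For orientation before opening that item, a from-scratch RNN step is only a few lines. This is a hypothetical minimal sketch (names and sizes are illustrative, not the linked library's code): h_t = tanh(x_t W_xh + h_{t-1} W_hh + b_h).

```python
import numpy as np

# Minimal vanilla RNN cell (illustrative sketch).
def rnn_step(x, h_prev, W_xh, W_hh, b_h):
    return np.tanh(x @ W_xh + h_prev @ W_hh + b_h)

rng = np.random.default_rng(0)
input_size, hidden_size = 4, 6
W_xh = rng.standard_normal((input_size, hidden_size)) * 0.1
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1
b_h = np.zeros(hidden_size)

h = np.zeros(hidden_size)
for x in rng.standard_normal((10, input_size)):   # a 10-step input sequence
    h = rnn_step(x, h, W_xh, W_hh, b_h)
print(h.shape)                                     # hidden state: (6,)
```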
ARTICLE 3. Recurrent Neural Networks (RNN) - The Vanishing Gradient Problem
  • What is the Vanishing Gradient Problem in RNN architecture?
20 minutes
VIDEO 4. GRU model
  • What is GRU and what is the math behind this model?
20 minutes
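For reference while watching, the GRU update in the notation of the Cho et al. (2014) paper (item 7 below) is:

```latex
\begin{aligned}
z_t &= \sigma(W_z x_t + U_z h_{t-1}) &&\text{(update gate)} \\
r_t &= \sigma(W_r x_t + U_r h_{t-1}) &&\text{(reset gate)} \\
\tilde{h}_t &= \tanh\!\bigl(W x_t + U (r_t \odot h_{t-1})\bigr) &&\text{(candidate state)} \\
h_t &= z_t \odot h_{t-1} + (1 - z_t) \odot \tilde{h}_t &&\text{(interpolation)}
\end{aligned}
```

Here σ is the logistic sigmoid and ⊙ the element-wise product; note that some libraries swap the roles of z_t and 1 − z_t in the last line.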
ARTICLE 5. GRU - More theory on this model
  • What is the GRU model?
20 minutes
LIBRARY 6. Implementing GRU in python
  • How to implement GRU in Python?
10 minutes
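As a preview of what such an implementation looks like, here is a hedged NumPy sketch of a single GRU step following the Cho et al. (2014) formulation, unrolled over a short sequence. All names, sizes, and the parameter layout are illustrative assumptions, not the linked library's code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# One GRU step (illustrative sketch, Cho et al. 2014 convention).
def gru_step(x, h_prev, p):
    z = sigmoid(x @ p["W_z"] + h_prev @ p["U_z"])                 # update gate
    r = sigmoid(x @ p["W_r"] + h_prev @ p["U_r"])                 # reset gate
    h_tilde = np.tanh(x @ p["W_h"] + (r * h_prev) @ p["U_h"])     # candidate state
    return z * h_prev + (1.0 - z) * h_tilde                       # interpolation

rng = np.random.default_rng(1)
input_size, hidden_size = 3, 5
p = {}
for g in ("z", "r", "h"):
    p[f"W_{g}"] = rng.standard_normal((input_size, hidden_size)) * 0.1
    p[f"U_{g}"] = rng.standard_normal((hidden_size, hidden_size)) * 0.1

h = np.zeros(hidden_size)
for x in rng.standard_normal((20, input_size)):   # run over a 20-step sequence
    h = gru_step(x, h, p)
print(h.shape)                                     # hidden state: (5,)
```

Because h_t is a convex combination of the previous state and a bounded candidate, the state stays in (−1, 1) without an explicit clamp; that gated, additive update is what eases gradient flow compared with the plain RNN.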
PAPER 7. Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation (Optional: original paper)
  • Where does this model come from?
20 minutes

Concepts Covered
