ARTICLE
How Does Attention Work in Encoder-Decoder Recurrent Neural Networks
Covers: theory of alignment
Estimated time needed to finish: 3 minutes
Questions this item addresses:
1- What is alignment?
2- How is alignment different from translation?
How to use this item: Read the "Attention Model" section.
URL:
https://machinelearningmastery.com/how-does-attention-work-in-encoder-decoder-recurrent-neural-networks/
Author(s): Jason Brownlee
Recipe
Effective Approaches to Attention-based Neural Machine Translation
Total time needed: ~2 hours
Objectives: Local and global attention in machine translation
Potential Use Cases: attention-based neural machine translation
Who Is This For? INTERMEDIATE
The following annotated items are listed with their details.
ARTICLE
1. How Does Attention Work in Encoder-Decoder Recurrent Neural Networks
1- What is alignment?
2- How is alignment different from translation?
3 minutes
ARTICLE
2. Global vs. Local Attention
1- What's the difference between global and local attention?
2 minutes
ARTICLE
3. Effective Approaches to Attention-based Neural Machine Translation
1- What is global attention?
2- What are the implementation problems with global attention?
3- What is local attention and how does it differ from global attention?
4- What are practical alignment functions and how do they perform with attention?
60 minutes
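To anchor the questions above, here is a minimal NumPy sketch (not code from the listed articles) contrasting global attention, which aligns over all source hidden states, with local attention, which restricts alignment to a window around a chosen source position, in the spirit of Luong et al.'s "Effective Approaches to Attention-based Neural Machine Translation". The dimensions, the `dot` alignment function, and the fixed window position `p_t` are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def score_dot(h_t, h_s):
    # The "dot" alignment function: score(h_t, h_s) = h_t . h_s,
    # computed for every source state at once.
    return h_s @ h_t

def global_attention(h_t, encoder_states):
    # Global attention: alignment weights over ALL source states,
    # then a context vector as their weighted average.
    a = softmax(score_dot(h_t, encoder_states))
    return a @ encoder_states, a

def local_attention(h_t, encoder_states, p_t, D=2):
    # Local attention: attend only within [p_t - D, p_t + D],
    # avoiding the cost of scoring every source position.
    lo, hi = max(0, p_t - D), min(len(encoder_states), p_t + D + 1)
    window = encoder_states[lo:hi]
    a = softmax(score_dot(h_t, window))
    return a @ window, a

rng = np.random.default_rng(0)
H = rng.normal(size=(10, 4))   # 10 source hidden states, dim 4 (assumed shapes)
h_t = rng.normal(size=4)       # current decoder hidden state
c_glob, a_glob = global_attention(h_t, H)
c_loc, a_loc = local_attention(h_t, H, p_t=5)
print(a_glob.shape, a_loc.shape)  # global weights span all 10 states; local only 5
```

Both variants return a context vector of the same dimensionality; the difference is only in how many source positions receive alignment weight.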