AI-Accelerated Product Development
ARTICLE
Effective Approaches to Attention-based Neural Machine Translation
Covers:
theory of neural machine translation
Estimated time needed to finish:
60 minutes
Questions this item addresses:
1- What is global attention?
2- What are the implementation problems with global attention?
3- What is local attention and how does it differ from global attention?
4- What are practical alignment functions and how do they perform in attention models? (A minimal sketch of these ideas follows this list.)
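Before reading the paper, it can help to see the shape of the computation these questions refer to. The sketch below is a minimal, illustrative NumPy version of a single global-attention step with the three alignment score functions the paper compares (dot, general, concat); the variable names, shapes, and parameters here are assumptions for demonstration, not the paper's reference implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def score(h_t, h_s_all, mode="dot", W_a=None, v_a=None):
    # Alignment score between the decoder state h_t (d,) and every
    # encoder state in h_s_all (S, d), for the three score functions
    # the paper compares: dot, general, concat.
    if mode == "dot":
        return h_s_all @ h_t
    if mode == "general":
        return h_s_all @ (W_a @ h_t)                       # W_a: (d, d)
    if mode == "concat":
        stacked = np.concatenate(                          # (S, 2d)
            [np.tile(h_t, (h_s_all.shape[0], 1)), h_s_all], axis=1)
        return np.tanh(stacked @ W_a.T) @ v_a              # W_a: (d, 2d), v_a: (d,)
    raise ValueError(f"unknown score mode: {mode}")

def global_attention(h_t, h_s_all, mode="dot", **params):
    # Global attention: score *every* source position, softmax the
    # scores into alignment weights, and take a weighted sum of the
    # encoder states as the context vector.
    a_t = softmax(score(h_t, h_s_all, mode, **params))     # alignment weights (S,)
    c_t = a_t @ h_s_all                                    # context vector (d,)
    return c_t, a_t

# Toy usage: 5 source positions, hidden size 4.
rng = np.random.default_rng(0)
h_s_all = rng.normal(size=(5, 4))
h_t = rng.normal(size=4)
c_t, a_t = global_attention(h_t, h_s_all, mode="dot")
print(a_t.round(3), a_t.sum())                             # weights sum to 1
```

Scoring every source position at every target step is also where the practical cost of global attention comes from, which motivates the local variant asked about in question 3.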
How to use this item?
Read the entire paper
URL:
https://arxiv.org/pdf/1508.04025.pdf
Author(s) / creator(s) / reference(s)
Luong et al. (2015)
Effective Approaches to Attention-based Neural Machine Translation
Total time needed:
~2 hours
Objectives
Local and global attention in machine translation
Potential Use Cases
Attention-based neural machine translation
Who is this for?
INTERMEDIATE
Click on each of the following annotated items to see details.
ARTICLE
1. How Does Attention Work in Encoder-Decoder Recurrent Neural Networks
1- What is alignment?
2- How is alignment different from translation? (A short note on the notation follows this item.)
3 minutes
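For orientation on this item's questions: in attention models, an "alignment" is a soft distribution over source positions for each target step, rather than a hard word-for-word translation mapping. Roughly in the notation of the paper linked above (with h_t the current target hidden state and h̄_s the source hidden states), the alignment weights and the context vector they produce are:

```latex
a_t(s) = \operatorname{align}(h_t, \bar{h}_s)
       = \frac{\exp\!\big(\operatorname{score}(h_t, \bar{h}_s)\big)}
              {\sum_{s'} \exp\!\big(\operatorname{score}(h_t, \bar{h}_{s'})\big)},
\qquad
c_t = \sum_{s} a_t(s)\, \bar{h}_s
```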
ARTICLE
2. Global vs. Local Attention
1- What's the difference between global and local attention? (See the sketch after this item.)
2 minutes
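As a preview of this item's question: global attention weights every source position at every target step, while local attention first picks a focus point p_t in the source sentence and only attends inside a small window around it, downweighting positions far from p_t with a Gaussian term (the paper's predictive variant also learns to predict p_t itself). The snippet below is a rough sketch in the same NumPy style as the earlier one; taking p_t as given, the window size D, and the Gaussian width are simplifying assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def local_attention(h_t, h_s_all, p_t, D=2):
    # Local attention sketch: attend only to source positions inside
    # the window [p_t - D, p_t + D], and favour positions near p_t
    # with a Gaussian factor (width set to D / 2 here).
    S, _ = h_s_all.shape
    lo, hi = max(0, p_t - D), min(S, p_t + D + 1)
    window = h_s_all[lo:hi]                      # (W, d) windowed encoder states
    scores = window @ h_t                        # dot-style scores, window only
    e = np.exp(scores - scores.max())
    a_t = e / e.sum()                            # softmax over the window
    positions = np.arange(lo, hi)
    a_t = a_t * np.exp(-((positions - p_t) ** 2) / (2 * (D / 2) ** 2))
    c_t = a_t @ window                           # context from the window only
    return c_t, a_t

# Toy usage: 10 source positions, hidden size 4, focus near position 6.
rng = np.random.default_rng(1)
h_s_all = rng.normal(size=(10, 4))
h_t = rng.normal(size=4)
c_t, a_t = local_attention(h_t, h_s_all, p_t=6, D=2)
print(len(a_t))                                  # only 2*D + 1 = 5 positions get weight
```

The point of contrast: the cost per target step is fixed by the window size rather than growing with source length, at the price of having to choose (or predict) where the window sits.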
ARTICLE
3. Effective Approaches to Attention-based Neural Machine Translation
1- What is global attention?
2- What are the implementation problems with global attention?
3- What is local attention and how does it differ from global attention?
4- What are practical alignment functions and how do they perform in attention models?
60 minutes
Concepts Covered
Global attention, local attention, alignment functions in neural machine translation