- Learning Objectives
- With this short list you will understand how the self-attention mechanism relates different positions of a single sequence to one another in order to compute a representation of that sequence.
- Potential Use Cases
- Help Google better discern the context of words in search queries.
- Target Audience
- Beginners looking to understand the Attention Is All You Need research paper.
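The self-attention mechanism named in the learning objective can be sketched as scaled dot-product attention of a sequence over itself. This is a minimal illustration, not the full Transformer: the function name and example data are hypothetical, and real Transformers apply learned query, key, and value projections rather than using the raw embeddings directly.

```python
import numpy as np

def self_attention(x):
    # x: (seq_len, d) matrix of token embeddings.
    # Illustrative sketch: queries, keys, and values are all the raw
    # inputs here; actual Transformers use learned W_Q, W_K, W_V projections.
    d = x.shape[-1]
    # Pairwise scores relating every position to every other position.
    scores = x @ x.T / np.sqrt(d)
    # Softmax over positions turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted mix of all positions in the sequence.
    return weights @ x

seq = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # hypothetical 3-token sequence
out = self_attention(seq)
print(out.shape)  # (3, 2): one representation per input position
```

Each row of the output depends on every position of the input, which is how self-attention computes a representation of the whole sequence.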