AI-Accelerated Product Development
Overview of Attention: Concept & Tool in Deep Learning
This Shortlist covers attention, a popular concept and a useful tool in deep learning. It covers: Seq2Seq models, attention mechanisms, Neural Turing Machines, and Transformers.
Potential Use Cases
Translation, Transformers, Generative Adversarial Networks (GANs)
Who Is This For?
Deep learning developers interested in NLP
1. Attention? Attention!
What are attention mechanisms?
How was attention invented?
What are various attention mechanisms and models?
What's wrong with the Seq2Seq model?
What are Neural Turing Machines?
What is a Pointer Network?
How can you build seq2seq models without recurrent network units?
What is a Self-Attention GAN?
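Before diving into the resource above, it may help to see the core computation these questions revolve around. The following is a minimal NumPy sketch of scaled dot-product attention (the building block of Transformers and self-attention); the function names and toy shapes are illustrative, not taken from the linked article.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Scores: similarity of each query to each key, scaled by sqrt(d_k)
    # so dot products don't grow with the key dimension.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # each row is a distribution over keys
    # Output: attention-weighted sum of the values.
    return weights @ V, weights

# Toy example: 2 queries attending over 3 key/value pairs of dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, weights = scaled_dot_product_attention(Q, K, V)
```

Self-attention is the special case where Q, K, and V are all projections of the same sequence, so every position attends over every other.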
2. DeepMind x UCL | Deep Learning Lectures | 8/12 | Attention and Memory in Deep Learning
What are some contemporary attention mechanisms?
What is the implicit attention present in any deep network?
What are discrete and differentiable variants of explicit attention?
How do networks with external memory work and how can attention provide them with selective recall?
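The last question, how attention gives networks with external memory selective recall, can be sketched with content-based addressing in the style of a Neural Turing Machine read head. This is an illustrative NumPy sketch (the function name, the sharpening parameter `beta`, and the toy one-hot memory are assumptions, not details from the lecture).

```python
import numpy as np

def content_read(memory, key, beta=10.0):
    # Cosine similarity between the read key and every memory row.
    sims = memory @ key / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    )
    # Sharpen with strength beta, then normalize into attention weights.
    w = np.exp(beta * sims)
    w = w / w.sum()
    # The read vector is the attention-weighted sum of memory rows:
    # a soft, differentiable lookup instead of a hard address.
    return w @ memory, w

memory = np.eye(4)                    # 4 memory slots with one-hot contents
key = np.array([0.0, 1.0, 0.0, 0.0])  # query resembling slot 1
read_vec, weights = content_read(memory, key)
```

Because the read is a weighted sum, gradients flow through the addressing weights, which is what lets the controller learn where to look.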
3. Attention Mechanism in Neural Networks
What's the difference between global and local attention?
Why is local attention also called window-based attention?
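The global/local distinction above can be made concrete with a small sketch: global attention normalizes over every source position, while local (window-based) attention masks out everything outside a window around a chosen center. This is a minimal NumPy illustration under assumed toy scores; the window placement here is fixed, whereas in practice the center may be predicted.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def global_attention(scores):
    # Global attention: weight is spread over ALL source positions.
    return softmax(scores)

def local_attention(scores, center, window):
    # Local ("window-based") attention: only positions within `window`
    # of the center keep their scores; the rest are masked to -inf,
    # which softmax maps to exactly zero weight.
    masked = np.full_like(scores, -np.inf)
    lo = max(0, center - window)
    hi = min(len(scores), center + window + 1)
    masked[lo:hi] = scores[lo:hi]
    return softmax(masked)

scores = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
g = global_attention(scores)                       # nonzero everywhere
l = local_attention(scores, center=2, window=1)    # nonzero only at 1..3
```

The name "window-based" comes directly from this mask: attention is confined to a sliding window rather than the full source sequence.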