Understand the paper: Smoothing and Shrinking the Sparse Seq2Seq Search Space

Total time needed: ~2 hours
Objectives
Understand the paper Smoothing and Shrinking the Sparse Seq2Seq Search Space (Peters & Martins, NAACL 2021).
Potential Use Cases
Grapheme-to-phoneme conversion (having a computer read text aloud), text-to-speech, morphological inflection, and machine translation.
Who Is This For?
INTERMEDIATE: Natural Language Processing (NLP) developers looking to understand how to build better sequence-to-sequence models.
Click on each of the following annotated items to see details.
Resource Assets
REPO 1. Background on Neural Machine Translation
  • What is neural machine translation?
  • How does neural machine translation work?
5 minutes
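
As background for these two questions, here is a minimal PyTorch sketch of an encoder-decoder NMT model; the class name, toy vocabulary sizes, and dimensions are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class TinySeq2Seq(nn.Module):
    """A bare-bones encoder-decoder translation model (hypothetical sketch)."""
    def __init__(self, src_vocab=100, tgt_vocab=100, dim=32):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, dim)
        self.encoder = nn.GRU(dim, dim, batch_first=True)
        self.decoder = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        # Encode the source sentence into a final hidden state.
        _, h = self.encoder(self.src_emb(src_ids))
        # Decode conditioned on that state (teacher forcing: the gold
        # target prefix is fed in at each step during training).
        dec_states, _ = self.decoder(self.tgt_emb(tgt_ids), h)
        return self.out(dec_states)  # next-token logits at every position

model = TinySeq2Seq()
src = torch.randint(0, 100, (2, 7))  # batch of 2 source sentences
tgt = torch.randint(0, 100, (2, 5))  # corresponding target prefixes
print(model(src, tgt).shape)         # torch.Size([2, 5, 100])
```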
ARTICLE 2. Regularization Techniques
  • What is Regularization?
  • How does Regularization help in reducing Overfitting?
5 minutes
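
One regularization technique worth knowing here is label smoothing, which this recipe's main paper reinterprets through Fenchel-Young losses (see item 9). A minimal NumPy sketch with made-up numbers:

```python
import numpy as np

def label_smooth(one_hot, eps=0.1):
    """Mix the one-hot target with the uniform distribution."""
    k = one_hot.shape[-1]
    return (1.0 - eps) * one_hot + eps / k

def cross_entropy(probs, target):
    return -np.sum(target * np.log(probs))

probs = np.array([0.7, 0.2, 0.1])  # model's predicted distribution
hard = np.array([1.0, 0.0, 0.0])   # gold label
soft = label_smooth(hard)          # [0.9333, 0.0333, 0.0333]

print(cross_entropy(probs, hard))  # ~0.357
print(cross_entropy(probs, soft))  # larger: smoothing penalizes over-confidence
```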
PAPER 3. Sparse Sequence-to-Sequence Models
  • What are sequence-to-sequence models?
  • What are dense attention alignments?
  • What are neural sparse seq2seq models?
10 minutes
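
To make the dense-vs-sparse contrast concrete, a minimal NumPy sketch comparing softmax with sparsemax (the alpha = 2 member of the entmax family); the attention scores are made-up values.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def sparsemax(z):
    """Euclidean projection of z onto the probability simplex
    (Martins & Astudillo, 2016)."""
    z_sorted = np.sort(z)[::-1]
    k = np.arange(1, len(z) + 1)
    cssv = np.cumsum(z_sorted) - 1.0
    support = z_sorted - cssv / k > 0
    tau = cssv[support][-1] / k[support][-1]
    return np.maximum(z - tau, 0.0)

scores = np.array([2.0, 1.2, 0.1, -1.0])
print(softmax(scores))    # dense: every position keeps some probability
print(sparsemax(scores))  # sparse: [0.9, 0.1, 0.0, 0.0] -- exact zeros
```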
WRITEUP 4. Application: Smoothing and Shrinking the Sparse Seq2Seq Search Space
  • What are the use cases of smoothing and shrinking the sparse Seq2Seq search space?
  • Grapheme-to-phoneme (G2P) models: convert a written word into its corresponding pronunciation; essential components of automatic speech recognition and text-to-speech systems. Source: https://ieeexplore.ieee.org/document/9054696
  • Cross-lingual morphological inflection: cross-lingual transfer between typologically related languages has proven successful for morphological inflection. Read more: https://www.aclweb.org/anthology/2020.sigmorphon-1.22/
  • Machine translation: translating text or speech from one language to another.
  Covers: implementation of entmax-based sparse sequence-to-sequence models (see the sketch below).
10 minutes
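
As a concrete illustration of the first two use cases, here is a minimal sketch of how G2P and morphological inflection reduce to character-level sequence-to-sequence transduction; the phoneme symbols and morphological tags are illustrative assumptions, not entries from any cited dataset.

```python
# Hypothetical training pairs: both tasks map a symbol sequence to a
# symbol sequence, so the same seq2seq architecture applies to each.

# Grapheme-to-phoneme: spelling in, ARPAbet-style phonemes out.
g2p_example = (list("though"), ["DH", "OW"])

# Morphological inflection: lemma plus tag symbols in, inflected form out.
inflection_example = (list("run") + ["<V>", "<PST>"], list("ran"))

print(g2p_example)  # (['t', 'h', 'o', 'u', 'g', 'h'], ['DH', 'OW'])
print(inflection_example)
```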
PAPER 5. On NMT Search Errors and Model Errors: Cat Got Your Tongue?
  • What is the "cat got your tongue" problem?
  • Why and where does this problem occur?
5 minutes
PAPER 6. Correcting Length Bias in Neural Machine Translation
  • What are the two main problems in neural machine translation?
  • What is the beam problem and the brevity problem?
15 minutes
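
A toy numeric sketch of the brevity problem and one standard correction, length-normalizing hypothesis scores; the per-token log-probabilities are invented.

```python
# Invented per-token log-probabilities for two competing hypotheses.
long_hyp = [-1.2] * 8      # an adequate 8-token translation
short_hyp = [-2.0, -1.0]   # a degenerate 2-token translation

def raw_score(logps):
    """Sum of log-probabilities: longer outputs accumulate more penalty."""
    return sum(logps)

def length_normalized(logps):
    """Average per-token log-probability, a common fix for length bias."""
    return sum(logps) / len(logps)

print(raw_score(long_hyp), raw_score(short_hyp))    # -9.6 vs -3.0: short wins
print(length_normalized(long_hyp),
      length_normalized(short_hyp))                 # -1.2 vs -1.5: long wins

# Larger beams find short, high-raw-score hypotheses more often, which is
# how the beam problem and the brevity problem feed each other.
```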
WRITEUP 7. Existing solutions for the "cat got your tongue" problem
  • What are existing solutions for the "cat got your tongue" problem?
2 minutes
PAPER 8. Six Challenges for Neural Machine Translation
  • What is the "beam search curse"?
3 minutes
PAPER 9. Learning Classifiers with Fenchel-Young Losses: Generalized Entropies, Margins, and Algorithms
  • What are Fenchel-Young Losses?
6 minutes
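
A minimal NumPy sketch of the Fenchel-Young construction L_Ω(θ; y) = Ω*(θ) + Ω(y) − ⟨θ, y⟩: plugging in the negative Gini entropy for Ω recovers the sparsemax loss. The score vector is a made-up example.

```python
import numpy as np

def sparsemax(z):
    """Simplex projection; the prediction map paired with negative Gini entropy."""
    z_sorted = np.sort(z)[::-1]
    k = np.arange(1, len(z) + 1)
    cssv = np.cumsum(z_sorted) - 1.0
    support = z_sorted - cssv / k > 0
    tau = cssv[support][-1] / k[support][-1]
    return np.maximum(z - tau, 0.0)

def neg_gini(p):
    """Omega(p) = 0.5 * (||p||^2 - 1), the Tsallis alpha=2 negative entropy."""
    return 0.5 * (np.dot(p, p) - 1.0)

def fy_loss(theta, y, omega, predict):
    """L(theta; y) = Omega*(theta) + Omega(y) - <theta, y>, where
    Omega*(theta) = <theta, p_hat> - Omega(p_hat) at p_hat = predict(theta)."""
    p_hat = predict(theta)
    conjugate = np.dot(theta, p_hat) - omega(p_hat)
    return conjugate + omega(y) - np.dot(theta, y)

theta = np.array([1.5, 0.3, -2.0])
y = np.array([1.0, 0.0, 0.0])  # one-hot gold label
print(fy_loss(theta, y, neg_gini, sparsemax))  # 0.0 here
```

Note the printed loss is exactly 0: unlike cross-entropy, Fenchel-Young losses with sparse prediction maps can be minimized exactly once all probability mass lands on the gold label.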
PAPER 10. Smoothing and Shrinking the Sparse Seq2Seq Search Space
  • Why are entmax-based seq2seq models better?
8 minutes
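
A short sketch of the "shrinking" claim, assuming the authors' entmax package (https://github.com/deep-spin/entmax, installable via pip install entmax); the function name follows that repository, and the logits are invented.

```python
import torch
from entmax import entmax15  # alpha = 1.5 entmax from deep-spin/entmax

logits = torch.tensor([3.0, 2.5, 0.2, -1.0, -3.0])  # invented next-token scores
p = entmax15(logits, dim=-1)
print(p)  # low-scoring tokens receive probability exactly 0

# Any token with p == 0 can be dropped from beam search with no
# approximation, shrinking the search space; softmax never produces
# exact zeros, so no hypothesis can ever be ruled out this way.
```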
