Covers: theory of Entropy
Why this is worth your time
It provides a definition of entropy.
How to use this item?

Read section 3.7, pages 20 to 21.

Author(s) / creator(s) / reference(s)
Dan Jurafsky

NLP Model Evaluation

Total time needed: ~18 minutes
This list provides basic information about metrics related to NLP model evaluation.
Potential Use Cases
Understand and build evaluations for NLP models
Who Is This For?
Click on each of the following annotated items to see details.
BOOK_CHAPTER 1. Extrinsic vs. Intrinsic
It provides a good overview and comparison between the two types of model evaluation.
5 minutes
VIDEO 2. Perplexity
It explains the basic concept of perplexity, which is an important metric for evaluating NLP models.
9 minutes
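As a quick illustration of the idea (not code from the video itself), here is a minimal Python sketch of perplexity computed from a model's per-token probabilities; the `perplexity` helper and the toy numbers are assumptions for demonstration only.

```python
import math

def perplexity(token_probs):
    # Perplexity is the inverse geometric mean of the per-token
    # probabilities: exp(-(1/N) * sum(log p_i)).
    n = len(token_probs)
    return math.exp(-sum(math.log(p) for p in token_probs) / n)

# A model that assigns probability 0.25 to each of four tokens
# behaves like a uniform choice among 4 options.
print(perplexity([0.25, 0.25, 0.25, 0.25]))  # 4.0
```

A uniform model over N equally likely tokens has perplexity exactly N, which is why perplexity is often read as an "effective branching factor".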
BOOK_CHAPTER 3. Entropy
It provides a definition of entropy.
10 minutes
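To make the definition concrete, here is a small Python sketch of Shannon entropy in bits (an illustration, not code from the chapter); the `entropy` helper and the example distributions are assumed for demonstration.

```python
import math

def entropy(probs):
    # Shannon entropy in bits: H(p) = -sum(p_i * log2(p_i)),
    # skipping zero-probability outcomes.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(entropy([0.125] * 8))  # fair 8-sided die: 3.0 bits
```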
VIDEO 4. Cross entropy
It provides a clear explanation of entropy and cross entropy, and of how this loss metric is used in ML.
10 minutes
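As a hedged sketch of the concept (not taken from the video), the snippet below computes cross entropy between a true distribution and a model distribution; `cross_entropy` and the toy distributions are assumptions for illustration.

```python
import math

def cross_entropy(p, q):
    # H(p, q) = -sum(p_i * log2(q_i)), in bits. It equals the entropy
    # of p when q matches p exactly, and grows as q diverges from p.
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

true_dist = [0.5, 0.5]
good_model = [0.5, 0.5]
bad_model = [0.9, 0.1]
print(cross_entropy(true_dist, good_model))  # 1.0 (the entropy of p)
print(cross_entropy(true_dist, bad_model))   # > 1.0, penalising the mismatch
```

This gap between cross entropy and entropy is what a cross-entropy loss drives toward zero during training.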
ARTICLE 5. Relationship between perplexity and entropy
It provides a good explanation of how perplexity and entropy are related.
10 minutes
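The relationship is direct: perplexity is 2 raised to the entropy when entropy is measured in bits. A toy check in Python, using an assumed example distribution:

```python
import math

# An assumed toy distribution over four outcomes.
probs = [0.5, 0.25, 0.125, 0.125]

entropy_bits = -sum(p * math.log2(p) for p in probs)
ppl = 2 ** entropy_bits  # perplexity = 2^H when H is in bits

print(entropy_bits)  # 1.75
print(ppl)           # ~3.36
```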
ARTICLE 6. Metrics for evaluating text generation NLP models
Overview and code examples of common evaluation metrics used in practice for text generation models.
10 minutes
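Many of these metrics build on n-gram overlap between generated and reference text. As an illustration of the simplest building block (the article covers the full metrics), here is clipped unigram precision, the unigram core of BLEU; the helper name and example sentences are assumptions.

```python
from collections import Counter

def clipped_unigram_precision(candidate, reference):
    # Fraction of candidate tokens that also appear in the reference,
    # crediting each reference token at most as often as it occurs there.
    cand, ref = Counter(candidate), Counter(reference)
    overlap = sum(min(count, ref[tok]) for tok, count in cand.items())
    return overlap / max(sum(cand.values()), 1)

cand = "the cat sat on the mat".split()
ref = "the cat is on the mat".split()
print(clipped_unigram_precision(cand, ref))  # 5 of 6 candidate tokens match
```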
ARTICLE 7. Metrics for evaluating text comparison in NLP models
It provides an overview of common evaluation metrics for comparing the generated text and target text of NLP models.
4 minutes

Concepts Covered
