Covers: theory of Entropy
Why this is worth your time
It provides a clear explanation of entropy and cross-entropy and how this loss metric is used in ML (a minimal worked sketch follows below).
How to use this item?

Watch from beginning to end

If the video fails to play, open the link directly: https://youtu.be/ErfnhcEV1O8
Author(s) / creator(s) / reference(s)
Aurelien Geron
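As a companion to the video's explanation, here is a minimal Python sketch of cross-entropy between a one-hot label and a predicted distribution, the way the loss is typically used for classification. This is not code from the video; the function name and toy numbers are illustrative assumptions.

```python
import math

def cross_entropy(p, q, eps=1e-12):
    """Cross-entropy H(p, q) = -sum_i p_i * log(q_i), in nats.

    p is the true distribution (here a one-hot label), q the model's
    predicted distribution; eps guards against log(0).
    """
    return -sum(pi * math.log(qi + eps) for pi, qi in zip(p, q))

# Toy 3-class example: the true class is class 0.
label = [1.0, 0.0, 0.0]
confident = [0.9, 0.05, 0.05]  # prediction close to the label -> low loss
hedged = [0.4, 0.3, 0.3]       # uncertain prediction -> higher loss

print(cross_entropy(label, confident))  # ~0.105
print(cross_entropy(label, hedged))     # ~0.916
```

With a one-hot label the loss reduces to minus the log of the probability the model assigns to the correct class, which is why the confident correct prediction scores so much lower.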

NLP Model Evaluation

Total time needed: ~58 minutes
Objectives
This list provides basic information about metrics related to NLP model evaluation.
Potential Use Cases
Understand and carry out NLP model evaluation
Who Is This For?
BEGINNER
Click on each of the following annotated items to see details.
BOOK_CHAPTER 1. Extrinsic vs. Intrinsic
It provides a good overview and comparison between the two types of model evaluation.
5 minutes
VIDEO 2. Perplexity
It explains the basic concept of perplexity, which is an important metric for evaluating NLP models.
9 minutes
BOOK_CHAPTER 3. Entropy
It provides a definition of entropy.
10 minutes
VIDEO 4. Cross entropy
It provides a clear explanation of entropy and cross-entropy and how this loss metric is used in ML.
10 minutes
ARTICLE 5. Relationship between perplexity and entropy
It provides a good explanation of how perplexity and entropy are related (a minimal numeric sketch of this relationship follows after the list).
10 minutes
ARTICLE 6. Metrics for text generation in NLP model evaluation
Overview, with code, of common evaluation metrics used in practical examples of text-generation models.
10 minutes
ARTICLE 7. Metrics for text comparison in NLP model evaluation
It provides an overview of common evaluation metrics for comparing the generated text of NLP models against target text (a toy overlap-metric sketch also follows after the list).
4 minutes
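Items 2, 3, and 5 fit together through one identity: perplexity is 2 raised to the cross-entropy measured in bits. Below is a minimal, self-contained Python sketch of that relationship; the function names and toy distributions are assumptions for illustration, not code from the linked resources.

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    """Cross-entropy H(p, q) in bits; assumes q_i > 0 wherever p_i > 0."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def perplexity(p, q):
    """Perplexity = 2 ** H(p, q): the model's effective branching factor."""
    return 2 ** cross_entropy(p, q)

# A uniform distribution over 4 outcomes has entropy log2(4) = 2 bits,
# so a model that matches it exactly has perplexity 2 ** 2 = 4.
uniform = [0.25] * 4
print(entropy(uniform))              # 2.0
print(perplexity(uniform, uniform))  # 4.0

# A mismatched model does worse: cross-entropy >= entropy (Gibbs'
# inequality), so its perplexity exceeds 4.
skewed = [0.7, 0.1, 0.1, 0.1]
print(perplexity(uniform, skewed))   # ~6.15
```

Lower perplexity means the model is less "surprised" by the data: a perplexity of k can be read as the model being, on average, as uncertain as a uniform choice among k options.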
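For item 7's compare-generated-text-against-target idea, here is a toy unigram-overlap precision/recall/F1 in Python. This is an assumed illustration of the general pattern, not an implementation from the article; practical text-comparison metrics are considerably more elaborate.

```python
from collections import Counter

def unigram_overlap(candidate: str, reference: str):
    """Word-level precision, recall, and F1 between a generated text
    (candidate) and a target text (reference), using clipped counts."""
    cand, ref = Counter(candidate.split()), Counter(reference.split())
    overlap = sum((cand & ref).values())  # matched words, clipped per word
    precision = overlap / max(sum(cand.values()), 1)
    recall = overlap / max(sum(ref.values()), 1)
    f1 = 2 * precision * recall / (precision + recall) if overlap else 0.0
    return precision, recall, f1

print(unigram_overlap("the cat sat on the mat",
                      "the cat lay on the mat"))
# (0.833..., 0.833..., 0.833...) -- five of the six words match
```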

Concepts Covered

Extrinsic vs. intrinsic evaluation, entropy, cross-entropy, perplexity, and evaluation metrics for text generation and text comparison.