Covers: implementation of Count-Based Methods
Questions this item addresses:
  • What is Latent Semantic Analysis (LSA)? Why LSA? Implementation of LSA?
How to use this item?

Read the whole article to understand what LSA is and how it can be implemented in Python.
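The Python implementation mentioned above can be sketched roughly as follows. This is an assumed, minimal illustration (toy corpus, scikit-learn's `TfidfVectorizer` and `TruncatedSVD`), not the article's own code: LSA builds a TF-IDF document-term matrix and factorizes it with truncated SVD to recover latent topics.

```python
# Minimal LSA sketch (assumed implementation, not the article's exact code):
# TF-IDF vectorize a toy corpus, then apply truncated SVD to get latent topics.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

corpus = [
    "the cat sat on the mat",
    "dogs and cats are pets",
    "stock markets rose sharply today",
    "investors traded stocks and bonds",
]

tfidf = TfidfVectorizer()
X = tfidf.fit_transform(corpus)      # sparse (4 documents x vocabulary) matrix

lsa = TruncatedSVD(n_components=2, random_state=0)
doc_topics = lsa.fit_transform(X)    # each document as a dense 2-d topic vector
print(doc_topics.shape)              # (4, 2)
```

Documents about similar subjects (the two pet sentences, the two finance sentences) end up close together in the reduced topic space.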

Author(s) / creator(s) / reference(s)
Sanket Doshi

Word Embeddings

Total time needed: ~55 minutes
With this list, you will master word embeddings, from basic count-based embeddings to state-of-the-art BERT embeddings.
Potential Use Cases
Word embeddings are used everywhere: text classification, language modelling, transfer learning, and more.
Who Is This For?
BEGINNER Data Scientist, Data Analyst
ARTICLE 1. What Are Word Embeddings?
  • What are word embeddings for text? What are some commonly used word embedding algorithms?
10 minutes
ARTICLE 2. The Distributional Hypothesis: semantic models in theory and practice
  • What is Distributional Hypothesis?
5 minutes
ARTICLE 3. An Intuitive Understanding of Word Embeddings: From Count Vectors to Word2Vec
  • What are count-based embedding methods? What is a Count Vector? What is TF-IDF? What is the Co-occurrence Matrix method for embeddings?
10 minutes
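The first two questions in item 3 can be illustrated with a small sketch. This is generic scikit-learn usage under my own toy sentences, not the article's code: a count vector is the raw term-frequency row of a document-term matrix, and TF-IDF re-weights those counts by inverse document frequency.

```python
# Hedged sketch of two count-based embeddings (illustrative, not from the
# article): count vectors vs. TF-IDF over the same toy documents.
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

docs = ["he is a lazy boy", "she is not lazy", "john is a person"]

# Count vectors: each row holds raw term frequencies for one document.
counts = CountVectorizer().fit_transform(docs)

# TF-IDF: term frequency down-weighted by how many documents contain the term,
# so ubiquitous words like "is" contribute less.
tfidf = TfidfVectorizer().fit_transform(docs)

print(counts.shape, tfidf.shape)  # same (documents x vocabulary) shape
```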
ARTICLE 4. Understanding Pointwise Mutual Information in NLP
  • What is Positive Pointwise Mutual Information?
10 minutes
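Positive pointwise mutual information, asked about in item 4, can be sketched in a few lines of NumPy. The co-occurrence counts below are made up for illustration; the formula is the standard one, PPMI(w, c) = max(0, log2(P(w, c) / (P(w) P(c)))).

```python
# Assumed PPMI sketch: given a word-context co-occurrence count matrix,
# compute positive pointwise mutual information.
import numpy as np

counts = np.array([[4.0, 0.0, 1.0],
                   [1.0, 3.0, 0.0],
                   [0.0, 1.0, 2.0]])     # rows: words, cols: context words

total = counts.sum()
p_wc = counts / total                    # joint probabilities P(w, c)
p_w = p_wc.sum(axis=1, keepdims=True)    # marginal P(w)
p_c = p_wc.sum(axis=0, keepdims=True)    # marginal P(c)

with np.errstate(divide="ignore"):       # log2(0) -> -inf, clipped below
    pmi = np.log2(p_wc / (p_w * p_c))
ppmi = np.maximum(pmi, 0.0)              # keep only positive associations
print(ppmi.round(2))
```

Clipping negative PMI values to zero is what makes the measure "positive": pairs that co-occur less often than chance are treated as uninformative rather than negatively associated.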
ARTICLE 5. Latent Semantic Analysis — Deduce the hidden topic from the document
  • What is Latent Semantic Analysis (LSA)? Why LSA? Implementation of LSA?
10 minutes
ARTICLE 6. The Illustrated Word2Vec
  • What is language modeling and how is it trained? What is skip-gram? What is negative sampling? What is the Word2Vec training process?
15 minutes
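The skip-gram question in item 6 is easy to ground with a tiny sketch. This is my own illustration, not the article's code: skip-gram training data consists of (center, context) word pairs drawn from a sliding window over the text.

```python
# Assumed illustration of skip-gram pair generation: for each center word,
# emit (center, context) pairs for every word within a fixed window.
def skipgram_pairs(tokens, window=2):
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:                       # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs

pairs = skipgram_pairs("thou shalt not make a machine".split())
print(pairs[:4])
```

These pairs become the positive training examples; negative sampling then adds randomly drawn (center, noise-word) pairs as negatives so the model can be trained as a cheap binary classifier.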
ARTICLE 7. Getting started with NLP: Word Embeddings, GloVe and Text classification
  • What is GloVe? How to load pre-trained GloVe word embeddings? How to visualize word embeddings? How to train a simple text classifier using GloVe embeddings?
15 minutes
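Loading pre-trained GloVe vectors, asked about in item 7, amounts to parsing a plain-text file of "word v1 v2 …" lines into a lookup table. The sketch below uses an in-memory sample with made-up numbers instead of a real `glove.6B` file, purely to show the parsing pattern:

```python
# Hedged sketch of loading GloVe-format vectors. The two vectors below are
# invented for illustration; a real file would have one such line per word.
import io
import numpy as np

sample = io.StringIO(
    "king 0.5 0.1 0.3\n"
    "queen 0.4 0.2 0.35\n"
)

embeddings = {}
for line in sample:                      # swap in open("glove.6B.100d.txt")
    word, *values = line.split()
    embeddings[word] = np.asarray(values, dtype=np.float32)

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(embeddings["king"], embeddings["queen"]))
```

Once loaded, the same dictionary can seed an embedding layer for the text classifier the article builds.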
ARTICLE 8. BERT Word Embeddings Tutorial
  • What is BERT? Why BERT embeddings? How to extract embeddings? How to run BERT on our text? How to interpret BERT's output?
20 minutes

Concepts Covered
