AI-Accelerated Product Development
With this list, you will master the art of embeddings, from basic count-based embeddings to state-of-the-art BERT embeddings.
Potential Use Cases
Word embeddings are used everywhere: text classification, language modelling, transfer learning, and more.
Who Is This For?
Data Scientists, Data Analysts
1. What Are Word Embeddings?
What are word embeddings for text? What are some commonly used word embedding algorithms?
2. The Distributional Hypothesis: semantic models in theory and practice
What is the Distributional Hypothesis?
3. An Intuitive Understanding of Word Embeddings: From Count Vectors to Word2Vec
What are count-based embedding methods? What is a count vector? What is TF-IDF? What is the co-occurrence matrix method for embeddings?
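The count-based methods in item 3 can be sketched in a few lines of plain Python. This is a minimal illustration on a made-up toy corpus (not from the linked article), using a simple smoothed IDF variant:

```python
import math
from collections import Counter

# Toy corpus (hypothetical example for illustration only).
corpus = ["the cat sat on the mat", "the dog sat on the log"]
docs = [doc.split() for doc in corpus]
vocab = sorted({w for doc in docs for w in doc})

# Count vector: one dimension per vocabulary word, holding raw term counts.
def count_vector(doc):
    counts = Counter(doc)
    return [counts[w] for w in vocab]

# TF-IDF: term frequency down-weighted by how many documents contain the word,
# so words shared by every document (like "the") contribute less.
def tfidf_vector(doc):
    counts = Counter(doc)
    n_docs = len(docs)
    vec = []
    for w in vocab:
        tf = counts[w] / len(doc)
        df = sum(1 for d in docs if w in d)
        idf = math.log(n_docs / df) + 1  # "+1" keeps ubiquitous words at a small weight
        vec.append(tf * idf)
    return vec

print(vocab)
print(count_vector(docs[0]))  # -> [1, 0, 0, 1, 1, 1, 2]
print([round(x, 2) for x in tfidf_vector(docs[0])])
```

Libraries like scikit-learn provide the same idea via `CountVectorizer` and `TfidfVectorizer`, with more careful normalisation.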
4. Understanding Pointwise Mutual Information in NLP
What is Positive Pointwise Mutual Information?
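Positive PMI from item 4 compares how often two words co-occur against how often they would co-occur by chance, clipping negative values to zero. A minimal sketch on an invented toy corpus (word and pair probabilities estimated by simple counting):

```python
import math
from collections import Counter
from itertools import combinations

# Tiny invented corpus: each inner list is one "context window".
sentences = [["cold", "ice"], ["cold", "snow"], ["warm", "sun"], ["cold", "ice"]]

pair_counts = Counter()
word_counts = Counter()
for sent in sentences:
    for w in sent:
        word_counts[w] += 1
    for a, b in combinations(sent, 2):
        pair_counts[(a, b)] += 1  # count both directions so PPMI is symmetric
        pair_counts[(b, a)] += 1

total_pairs = sum(pair_counts.values())
total_words = sum(word_counts.values())

def ppmi(w, c):
    """Positive PMI: log of observed vs. expected co-occurrence, clipped at 0."""
    p_wc = pair_counts[(w, c)] / total_pairs
    if p_wc == 0:
        return 0.0
    p_w = word_counts[w] / total_words
    p_c = word_counts[c] / total_words
    return max(0.0, math.log2(p_wc / (p_w * p_c)))

print(ppmi("cold", "ice"))  # positive: they co-occur more than chance
print(ppmi("cold", "sun"))  # 0.0: they never co-occur
```

Replacing raw counts in a co-occurrence matrix with PPMI scores is a common preprocessing step before factorising it into embeddings.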
5. Latent Semantic Analysis — Deduce the hidden topic from the document
What is Latent Semantic Analysis (LSA)? Why use LSA? How is LSA implemented?
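At its core, LSA from item 5 is a truncated SVD of a term-document matrix: keeping only the top singular vectors projects documents into a low-dimensional "topic" space. A minimal NumPy sketch with hypothetical counts (two animal-themed and two finance-themed documents):

```python
import numpy as np

# Term-document count matrix (made-up values). Rows: terms, columns: documents.
terms = ["cat", "dog", "pet", "stock", "market"]
X = np.array([
    [2, 1, 0, 0],   # cat
    [1, 2, 0, 0],   # dog
    [1, 1, 0, 0],   # pet
    [0, 0, 2, 1],   # stock
    [0, 0, 1, 2],   # market
], dtype=float)

# LSA: truncated SVD keeps the top-k singular directions as latent topics.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
doc_topics = (np.diag(s[:k]) @ Vt[:k]).T  # each row: one document in topic space

# Documents about the same theme land close together in topic space.
print(np.round(doc_topics, 2))
```

scikit-learn's `TruncatedSVD` does the same factorisation on sparse matrices, which is what you would use on a real corpus.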
6. The Illustrated Word2Vec
What is language modeling and how is it trained? What is skip-gram? What is negative sampling? How does the Word2Vec training process work?
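The skip-gram and negative-sampling ideas from item 6 are easiest to see at the data-preparation step, before any network training. A toy sketch (the real Word2Vec excludes the true context from negatives and samples them from a smoothed unigram distribution; this simplified version just samples uniformly):

```python
import random

# Skip-gram pairs: each word predicts its neighbours within a window.
sentence = "the quick brown fox jumps over the lazy dog".split()
window = 2

pairs = []
for i, center in enumerate(sentence):
    for j in range(max(0, i - window), min(len(sentence), i + window + 1)):
        if j != i:
            pairs.append((center, sentence[j]))

print(pairs[:4])  # -> [('the', 'quick'), ('the', 'brown'), ('quick', 'the'), ('quick', 'brown')]

# Negative sampling: for each true (center, context) pair, draw k random words
# as "fake" contexts; the model learns to separate true pairs from fake ones.
vocab = list(set(sentence))
def negatives(center, k=2, rng=random.Random(0)):
    # Simplification: uniform sampling; real Word2Vec uses unigram^0.75 weights.
    return [rng.choice(vocab) for _ in range(k)]

print(pairs[0], "->", negatives(pairs[0][0]))
```

Training then nudges the center and true-context vectors together while pushing the center and negative-sample vectors apart; libraries like gensim wrap the whole loop.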
7. Getting started with NLP: Word Embeddings, GloVe and Text classification
What is GloVe? How do you load pre-trained GloVe word embeddings? How do you visualize word embeddings? How do you train a simple text classifier using GloVe embeddings?
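Pre-trained GloVe files (e.g. the glove.6B downloads) are plain text: one word per line followed by its vector, and item 7 covers loading them. The parsing step can be sketched like this, with a tiny in-memory stand-in replacing the real file (the real vectors are 50-300 dimensional; these 3-d values are invented):

```python
import io
import numpy as np

# Stand-in for open("glove.6B.50d.txt"): same one-word-per-line format,
# but with made-up 3-dimensional vectors.
fake_glove = io.StringIO(
    "king 0.5 0.7 0.1\n"
    "queen 0.5 0.8 0.2\n"
    "apple 0.9 0.1 0.6\n"
)

embeddings = {}
for line in fake_glove:
    word, *values = line.split()
    embeddings[word] = np.array(values, dtype=float)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Related words should score higher than unrelated ones.
print(cosine(embeddings["king"], embeddings["queen"]))
print(cosine(embeddings["king"], embeddings["apple"]))
```

For a classifier, the loaded vectors typically initialise an embedding layer (often frozen), with document vectors formed by averaging or fed through a small network.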
8. BERT Word Embeddings Tutorial
What is BERT? Why BERT embeddings? How do you extract embeddings? How do you run BERT on your text? How do you interpret BERT's output?