PAPER: ‘Less Than One’-Shot Learning: Learning N Classes From M<N Samples

Covers: theory of 'Less Than One'-Shot Learning
Estimated time needed: 10 minutes
Questions this item addresses:
  • How can we learn to classify N classes from a dataset that contains fewer than N samples?
How to use this item?

Read the last 2 sections of the paper

Author(s) / creator(s) / reference(s)
Ilia Sucholutsky, Matthias Schonlau

Learning N Classes From M<N Samples

Nour Fahmy
Total time needed: ~28 minutes
Learning Objectives
Learn about developments in deep learning research for few-shot learning
Potential Use Cases
Any scenario where you, too, have less-than-optimal sample sizes!
Target Audience
ADVANCED: If you're already familiar with neural networks, have worked with them in the past, and are looking to navigate suboptimal sample sizes.
Go through the following annotated items in order:
PAPER 1. Dataset Distillation
  • Is it possible to train a model on a small set of synthetic data that lies outside the manifold of the original data?
  • How much data is encoded in a given training set, and how compressible is it? (A minimal code sketch of the idea appears after this list.)
18 minutes
PAPER 2. Soft-Label Dataset Distillation and Text Dataset Distillation
  • How can we train a model using dataset distillation while relaxing the hard labels into learnable probability distributions? (A sketch follows after this list.)
10 minutes
PAPER 3. ‘Less Than One’-Shot Learning: Learning N Classes From M<N Samples
  • How can we learn to classify N classes from a dataset that contains fewer than N samples? (A soft-label prototype sketch follows after this list.)
10 minutes
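
For item 1, the core idea of dataset distillation can be pictured as a bilevel optimization loop: learn a tiny synthetic dataset such that a freshly initialized model, trained on it for a single gradient step, performs well on real data. The following is a minimal PyTorch sketch, not the paper's implementation; the linear model, random stand-in data, and all hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Stand-in for real training data (e.g., flattened 28x28 images, 10 classes).
real_x, real_y = torch.randn(256, 784), torch.randint(0, 10, (256,))

# The synthetic set is tiny: 10 learnable "images", one per class.
syn_x = torch.randn(10, 784, requires_grad=True)
syn_y = torch.arange(10)
inner_lr = 0.1
opt = torch.optim.Adam([syn_x], lr=0.01)

for step in range(100):
    model = nn.Linear(784, 10)          # fresh random init each outer step
    w, b = model.weight, model.bias
    # Inner step: one gradient update on synthetic data, kept differentiable
    # (create_graph=True) so we can backprop through the update itself.
    loss_syn = F.cross_entropy(F.linear(syn_x, w, b), syn_y)
    gw, gb = torch.autograd.grad(loss_syn, (w, b), create_graph=True)
    w2, b2 = w - inner_lr * gw, b - inner_lr * gb
    # Outer step: evaluate the updated model on real data; the gradient
    # flows back into the synthetic images.
    loss_real = F.cross_entropy(F.linear(real_x, w2, b2), real_y)
    opt.zero_grad()
    loss_real.backward()
    opt.step()
```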
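For item 2, soft-label dataset distillation extends the sketch above by making the synthetic labels themselves learnable: the hard one-hot targets are relaxed into probability distributions, so fewer synthetic points can carry information about several classes at once. Again a hedged PyTorch sketch with invented shapes and hyperparameters:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

real_x, real_y = torch.randn(256, 784), torch.randint(0, 10, (256,))

syn_x = torch.randn(5, 784, requires_grad=True)        # fewer points than classes
syn_y_logits = torch.zeros(5, 10, requires_grad=True)  # learnable soft labels
inner_lr = 0.1
opt = torch.optim.Adam([syn_x, syn_y_logits], lr=0.01)

def soft_ce(logits, target_probs):
    # Cross-entropy against a soft target distribution.
    return -(target_probs * F.log_softmax(logits, dim=1)).sum(dim=1).mean()

for step in range(100):
    model = nn.Linear(784, 10)
    w, b = model.weight, model.bias
    soft_targets = F.softmax(syn_y_logits, dim=1)      # labels as distributions
    loss_syn = soft_ce(F.linear(syn_x, w, b), soft_targets)
    gw, gb = torch.autograd.grad(loss_syn, (w, b), create_graph=True)
    w2, b2 = w - inner_lr * gw, b - inner_lr * gb
    loss_real = F.cross_entropy(F.linear(real_x, w2, b2), real_y)
    opt.zero_grad()
    loss_real.backward()                               # updates images AND labels
    opt.step()
```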
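For item 3, the paper's closing sections show how M soft-label prototypes can carve the input space into N>M classes. Below is a toy NumPy sketch in the spirit of the paper's distance-weighted soft-label prototype classifier (SLaPkNN); the coordinates and label distributions are made up for illustration: two prototypes separate three classes, with the third class "emerging" between them.

```python
import numpy as np

# Two prototypes carry soft labels over THREE classes (M=2 < N=3).
prototypes = np.array([[0.0, 0.0],
                       [4.0, 0.0]])
soft_labels = np.array([[0.6, 0.0, 0.4],   # mostly class 0, some class 2
                        [0.0, 0.6, 0.4]])  # mostly class 1, some class 2

def classify(x):
    # Weight each prototype's soft label by inverse distance to x,
    # then predict the class with the largest summed influence.
    d = np.linalg.norm(prototypes - x, axis=1)
    weights = 1.0 / np.maximum(d, 1e-9)
    scores = weights @ soft_labels
    return int(np.argmax(scores))

print(classify(np.array([0.5, 0.0])))  # near prototype 0 -> class 0
print(classify(np.array([3.5, 0.0])))  # near prototype 1 -> class 1
print(classify(np.array([2.0, 0.0])))  # midway -> class 2 emerges
```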

Concepts Covered