Past Recording
ShareStar
'Less Than One'-Shot Learning
Tuesday Feb 23 2021 17:00 GMT
Why This Is Interesting

Deep neural networks require large training sets, which bring high computational cost and long training times. Training on much smaller sets while maintaining nearly the same accuracy would therefore be very beneficial. In the few-shot learning setting, a model must learn a new class given only a small number of samples from that class. One-shot learning is the extreme form of few-shot learning where the model must learn a new class from a single example.
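As a minimal sketch of the one-shot setting, a simple baseline stores a single exemplar per class and assigns each query to the class of its nearest exemplar. The 2-D feature vectors and class names below are illustrative placeholders, not real embeddings:

```python
import numpy as np

# One stored example per class (the "one shot"); in practice these
# would be embeddings produced by a pretrained feature extractor.
exemplars = {
    "cat": np.array([0.0, 1.0]),
    "dog": np.array([1.0, 0.0]),
}

def one_shot_classify(query, exemplars):
    """Assign the query to the class of the nearest single exemplar."""
    return min(exemplars, key=lambda label: np.linalg.norm(query - exemplars[label]))

print(one_shot_classify(np.array([0.1, 0.9]), exemplars))  # prints "cat"
```

Stronger one-shot methods learn the feature space so that this nearest-exemplar rule works well, but the classification step itself stays this simple.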

Discussion Points
  • How can we learn from a dataset that has fewer classes than the number we hope to classify?
  • Is it possible to train a model on synthetic data that lies outside the manifolds of the original data?
  • How much information is encoded in a given training set, and how compressible is it?
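The first discussion point can be sketched with soft labels: if each stored point carries a probability distribution over classes rather than a single hard label, a distance-weighted nearest-neighbor rule can separate more classes than there are points. The prototype positions and soft-label values below are made-up illustrations, not taken from the paper:

```python
import numpy as np

# Two 1-D prototype points carrying soft labels over THREE classes (A, B, C).
prototypes = np.array([[0.0], [1.0]])
soft_labels = np.array([
    [0.6, 0.0, 0.4],   # prototype at 0: mostly class A, partly class C
    [0.0, 0.6, 0.4],   # prototype at 1: mostly class B, partly class C
])

def soft_label_classify(query, prototypes, soft_labels, eps=1e-9):
    """Blend soft labels by inverse distance and pick the argmax class."""
    dists = np.linalg.norm(prototypes - query, axis=1)
    weights = 1.0 / (dists + eps)
    weights /= weights.sum()
    return int(np.argmax(weights @ soft_labels))

print(soft_label_classify(np.array([0.5]), prototypes, soft_labels))  # prints 2 (class C)
```

Queries near either prototype fall to class A or B, while queries in between are dominated by the shared class-C mass, so two samples yield three decision regions, i.e. "less than one" sample per class.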