Past Recording
Learning Permutation Invariant Representations using Memory Networks
Wednesday Aug 26 2020 16:00 GMT
Why This Is Interesting

Many real-world tasks, such as classification of digital histopathological images and 3D object detection, involve learning from a set of instances. In these cases, only a group of instances, or a set, collectively contains meaningful information, and therefore only the sets have labels, not the individual data instances. In this work, we present a permutation-invariant neural network called Memory-based Exchangeable Model (MEM) for learning universal set functions. The MEM model consists of memory units that embed an input sequence into high-level features, enabling it to learn inter-dependencies among instances through a self-attention mechanism. We evaluated the learning ability of MEM on various toy datasets, point cloud classification, and classification of whole slide images (WSIs) into two subtypes of lung cancer: Lung Adenocarcinoma and Lung Squamous Cell Carcinoma.
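
The abstract describes the two ingredients that make such a model independent of instance order: per-instance embeddings mixed by self-attention (which is permutation-equivariant), followed by a symmetric pooling step that yields a single set-level prediction. The sketch below is not the authors' MEM implementation; it is a minimal PyTorch illustration of that general recipe, with hypothetical names (SetClassifier, hid_dim, n_heads) and dimensions chosen only for the example.

    import torch
    import torch.nn as nn

    class SetClassifier(nn.Module):
        # Minimal permutation-invariant set classifier (illustrative sketch, not the MEM code).
        def __init__(self, in_dim, hid_dim=128, n_heads=4, n_classes=2):
            super().__init__()
            self.embed = nn.Linear(in_dim, hid_dim)                                 # per-instance embedding
            self.attn = nn.MultiheadAttention(hid_dim, n_heads, batch_first=True)   # self-attention across instances
            self.classify = nn.Linear(hid_dim, n_classes)                           # set-level prediction head

        def forward(self, x):                  # x: (batch, set_size, in_dim)
            h = torch.relu(self.embed(x))      # embed each instance independently
            h, _ = self.attn(h, h, h)          # attention mixes information between instances (permutation-equivariant)
            z = h.mean(dim=1)                  # symmetric pooling: instance order no longer matters
            return self.classify(z)            # one label per set, not per instance

    # Example: classify 8 point clouds of 256 three-dimensional points each.
    model = SetClassifier(in_dim=3)
    logits = model(torch.randn(8, 256, 3))     # shape (8, 2)

Because the only interaction between instances happens through attention and the final mean pooling, shuffling the instances within a set leaves the output unchanged (up to floating-point error), which is the property the talk refers to as permutation invariance.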

Discussion Points
  • multi-instance learning
  • permutation-invariant inputs to neural networks
  • set-based vs instance-based approaches for vision problems
  • application of the novel approach in medical imaging
Time of Recording: Wednesday Aug 26 2020 16:00 GMT