[MEM] Learning Permutation Invariant Representations using Memory Networks | AISC
Speaker: Shivam Kalra
Host: Mahdi Biparva
Find the recording, slides, and more info at https://ai.science/e/mem-learning-permutation-invariant-representations-using-memory-networks--WzJoy0DKJD30IclF8mkD
Motivation / Abstract
Many real-world tasks, such as classification of digital histopathology images and 3D object detection, involve learning from a set of instances. In these cases, only a group of instances, or a set, collectively contains meaningful information; therefore, only the sets have labels, not the individual data instances. In this work, we present a permutation-invariant neural network called Memory-based Exchangeable Model (MEM) for learning universal set functions. The MEM model consists of memory units that embed an input sequence into high-level features, enabling it to learn inter-dependencies among instances through a self-attention mechanism. We evaluated the learning ability of MEM on various toy datasets, point cloud classification, and classification of whole slide images (WSIs) into two subtypes of lung cancer—Lung Adenocarcinoma and Lung Squamous Cell Carcinoma.
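To illustrate the core idea behind the abstract (not the paper's exact MEM architecture), here is a minimal NumPy sketch of a permutation-invariant set encoder: self-attention lets instances exchange information (it is permutation-equivariant), and a symmetric pooling step (mean) makes the final set embedding invariant to instance order. All function names here are illustrative assumptions, not from the paper.

```python
import numpy as np

def self_attention(X):
    # X: (n, d) matrix whose rows are instance features of a set.
    # Scaled dot-product attention of the set over itself;
    # reordering the rows of X reorders the output rows identically.
    scores = X @ X.T / np.sqrt(X.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ X

def set_embedding(X):
    # Mean pooling is symmetric in its inputs, so composing it with the
    # permutation-equivariant attention yields a permutation-invariant embedding.
    return self_attention(X).mean(axis=0)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))
perm = rng.permutation(5)
# Shuffling the instances does not change the set-level representation.
assert np.allclose(set_embedding(X), set_embedding(X[perm]))
```

The same invariance argument applies regardless of how the per-instance features are produced (e.g., by a CNN over histopathology patches), which is why set-level labels suffice for training.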
What was discussed?
- multi-instance learning
- permutation-invariant inputs to neural networks
- set-based vs instance-based approaches for vision problems
- application of the novel approach in medical imaging
------
#AISC hosts 3-5 live sessions like this on various AI research, engineering, and product topics every week! Visit https://ai.science for more details.