Representation Learning of Histopathology Images using Graph Neural Networks | AISC
Find the recording, slides, and more info at https://ai.science/e/representation-learning-of-histopathology-images-using-graph-neural-networks--dKvmB7GkoU9tPtFyd0fH
Speaker: Mohammed Adnan; Discussion Facilitator: Nabila Abraham
Motivation / Abstract
Representation learning for Whole Slide Images (WSIs) is pivotal in developing image-based systems that achieve higher precision in diagnostic pathology. The authors propose a two-stage framework for WSI representation learning: they use graph neural networks to learn relations among sampled representative patches and aggregate the image information into a single vector representation, and they introduce attention via graph pooling to automatically infer which patches are most relevant. The authors experimented on 1,026 lung cancer WSIs at 40× magnification from The Cancer Genome Atlas (TCGA), the largest public repository of histopathology images, and achieved state-of-the-art accuracy of 88.8% and AUC of 0.89 on lung cancer sub-type classification, using patch features extracted from a pre-trained DenseNet model. The authors will be presenting this work at CVPR 2020!
What was discussed?
- Modelling multiple instance learning as a graph problem
- Graph pooling with attention
- Adjacency matrix learning
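To make the three discussion points concrete, here is a minimal NumPy sketch of the overall idea, not the authors' implementation: a WSI is treated as a bag of patch feature vectors (the multiple-instance view), an adjacency matrix is built from pairwise patch similarity (standing in for the learned adjacency), one message-passing step mixes neighboring patches, and a softmax attention pooling collapses the graph into a single slide-level vector. The function names and the cosine-similarity adjacency are illustrative assumptions, not the paper's exact design.

```python
import numpy as np

def learn_adjacency(feats):
    # Illustrative stand-in for adjacency learning: cosine-similarity
    # affinities among patch features, clipped to be non-negative and
    # row-normalized so that propagation averages over neighbors.
    unit = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    adj = np.maximum(unit @ unit.T, 0.0)
    return adj / adj.sum(axis=1, keepdims=True)

def attention_pool(feats, w_attn):
    # Graph pooling with attention: a learned projection scores each
    # patch, softmax turns scores into weights, and the slide vector
    # is the attention-weighted sum of patch features.
    scores = feats @ w_attn
    scores = np.exp(scores - scores.max())   # numerically stable softmax
    alpha = scores / scores.sum()
    return alpha @ feats, alpha

def wsi_vector(patch_feats, w_attn):
    # MIL as a graph problem: the bag of patches becomes a graph,
    # one message-passing step aggregates neighbor information,
    # and attention pooling yields a single WSI representation.
    adj = learn_adjacency(patch_feats)
    h = np.tanh(adj @ patch_feats)
    return attention_pool(h, w_attn)

# Toy usage: 8 patches with 16-dim (e.g. DenseNet-derived) features.
rng = np.random.default_rng(0)
patches = rng.standard_normal((8, 16))
w = rng.standard_normal(16)
vec, alpha = wsi_vector(patches, w)
print(vec.shape, alpha.sum())  # one 16-dim slide vector; weights sum to 1
```

The attention weights `alpha` play the role of the relevance scores discussed in the talk: patches with higher weights contribute more to the slide-level vector, which is what lets the model highlight diagnostically important regions.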
What are the key takeaways?
------
#AISC hosts 3-5 live sessions like this on various AI research, engineering, and product topics every week! Visit https://ai.science for more details