Meta-Graph: Few-Shot Link Prediction Using Meta-Learning | AISC
Speaker(s): Avishek (Joey) Bose
Facilitator(s): Nabila Abraham
Find the recording, slides, and more info at https://ai.science/e/meta-graph-few-shot-link-prediction-using-meta-learning--mqol9BHPyj5ZFajdONQ5
Motivation / Abstract
Link prediction is a ubiquitous task that applies to many real-world scenarios, including biomedical interaction networks, social networks, and recommendation systems. The goal of link prediction is to learn from a graph how to infer missing or previously unknown relationships. For instance, in a social network we may use link prediction to power a friend-recommendation system, while in a biological network it might be used to infer possible relationships between drugs and diseases. However, previous work on link prediction generally focuses on one particular problem setting: it assumes link prediction is performed on a single dense graph, with at least 50% of the true edges observed during training. Bose and his co-authors investigate how to perform link prediction when only a sparse sample of edges (less than 30%) is available. They formulate link prediction as a few-shot learning problem and solve it with a multi-graph, meta-learning strategy. Experiments on 3 very different datasets show that Meta-Graph performs strongest in the sparse-data regime, achieving new state-of-the-art results on sparse graphs.
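To make the "multi-graph, meta-learning" formulation concrete, here is a minimal sketch of what such a training loop can look like, written in the style of first-order MAML: a shared initialization is adapted on each graph's small support set of observed edges and then updated from held-out query edges. The MLP edge scorer, the synthetic graphs, and every hyperparameter below are illustrative assumptions; Meta-Graph itself builds on a VGAE model and a more elaborate gradient-based meta-learner.

```python
# A minimal sketch (not the authors' released code) of a multi-graph,
# few-shot meta-learning loop for link prediction, in the style of
# first-order MAML. All model choices and hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EdgeScorer(nn.Module):
    """Encode node features with an MLP and score an edge by a dot product."""
    def __init__(self, in_dim=16, hid_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU(),
                                     nn.Linear(hid_dim, hid_dim))

    def forward(self, x, edges):
        z = self.encoder(x)                                  # node embeddings
        return (z[edges[0]] * z[edges[1]]).sum(dim=-1)       # one logit per edge

def link_loss(model, x, pos_edges, neg_edges):
    """Binary cross-entropy over observed (positive) and sampled negative edges."""
    logits = torch.cat([model(x, pos_edges), model(x, neg_edges)])
    labels = torch.cat([torch.ones(pos_edges.size(1)),
                        torch.zeros(neg_edges.size(1))])
    return F.binary_cross_entropy_with_logits(logits, labels)

def random_graph(num_nodes=50, num_edges=60, in_dim=16):
    """Toy stand-in for one sparsely observed graph in the meta-training set."""
    return torch.randn(num_nodes, in_dim), torch.randint(0, num_nodes, (2, num_edges))

meta_model = EdgeScorer()
meta_opt = torch.optim.Adam(meta_model.parameters(), lr=1e-3)
inner_lr, inner_steps = 1e-2, 3

for step in range(100):
    x, edges = random_graph()
    # Few-shot setting: only a small "support" subset of edges is observed;
    # the remaining "query" edges drive the meta-update.
    perm = torch.randperm(edges.size(1))
    support, query = edges[:, perm[:20]], edges[:, perm[20:]]
    neg = torch.randint(0, x.size(0), (2, edges.size(1)))    # sampled non-edges

    # Inner loop: adapt a per-graph copy of the shared initialization.
    local_model = EdgeScorer()
    local_model.load_state_dict(meta_model.state_dict())
    local_opt = torch.optim.SGD(local_model.parameters(), lr=inner_lr)
    for _ in range(inner_steps):
        local_opt.zero_grad()
        link_loss(local_model, x, support, neg[:, :20]).backward()
        local_opt.step()

    # Outer loop (first-order approximation): treat the adapted model's
    # query-set gradients as gradients for the shared initialization.
    local_opt.zero_grad()
    link_loss(local_model, x, query, neg[:, 20:]).backward()
    for meta_p, local_p in zip(meta_model.parameters(), local_model.parameters()):
        meta_p.grad = local_p.grad.clone()
    meta_opt.step()
```

The support/query split above mirrors the sparse regime discussed in the talk, where only a small fraction of a new graph's edges are available for adaptation.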
What was discussed?
- the difference between 'pre-training' and 'fine-tuning' in Meta-Graph
- the motivation for the meta-learning approach
- the rationale for VGAE as the baseline link prediction framework
- the importance of the graph signature function and what it represents (a rough sketch of both VGAE and the signature appears after this list)
- future directions in scaling Meta-Graph to heterogeneous graphs/knowledge graphs
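For readers unfamiliar with the two components mentioned above, here is a rough, hedged sketch of a VGAE-style link predictor whose encoder is modulated by a graph signature vector. The one-layer GCN-style encoder, mean pooling, and sigmoid gating are illustrative choices, not the authors' exact architecture.

```python
# A rough sketch (assumptions, not the Meta-Graph implementation) of a
# VGAE-style link predictor with a graph signature: a single vector
# summarizing the graph that gates the encoder's hidden activations.
import torch
import torch.nn as nn
import torch.nn.functional as F

def normalize_adj(adj):
    """Symmetric GCN normalization D^{-1/2} (A + I) D^{-1/2}."""
    a = adj + torch.eye(adj.size(0))
    d_inv_sqrt = a.sum(dim=1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * a * d_inv_sqrt.unsqueeze(0)

class SignatureVGAE(nn.Module):
    def __init__(self, in_dim=16, hid_dim=32, z_dim=16):
        super().__init__()
        self.gcn1 = nn.Linear(in_dim, hid_dim)
        self.gcn_mu = nn.Linear(hid_dim, z_dim)
        self.gcn_logvar = nn.Linear(hid_dim, z_dim)
        # Graph signature function: a separate GCN-style layer whose
        # mean-pooled output summarizes the whole graph as one vector.
        self.sig_gcn = nn.Linear(in_dim, hid_dim)
        self.sig_gate = nn.Linear(hid_dim, hid_dim)

    def encode(self, x, adj_norm):
        sig = torch.relu(adj_norm @ self.sig_gcn(x)).mean(dim=0)  # s(G)
        gate = torch.sigmoid(self.sig_gate(sig))                  # per-channel gate
        h = torch.relu(adj_norm @ self.gcn1(x)) * gate             # modulated encoder
        return adj_norm @ self.gcn_mu(h), adj_norm @ self.gcn_logvar(h)

    def decode(self, z):
        """Inner-product decoder: edge probability for every node pair."""
        return torch.sigmoid(z @ z.t())

    def forward(self, x, adj):
        adj_norm = normalize_adj(adj)
        mu, logvar = self.encode(x, adj_norm)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()      # reparameterization
        return self.decode(z), mu, logvar

# Usage on a toy graph: reconstruct the sparsely observed adjacency matrix.
x = torch.randn(50, 16)
adj = (torch.rand(50, 50) < 0.05).float()
adj = torch.triu(adj, diagonal=1)
adj = adj + adj.t()                                               # symmetric, no self-loops
model = SignatureVGAE()
pred, mu, logvar = model(x, adj)
recon = F.binary_cross_entropy(pred, adj)
kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
(recon + kl).backward()
```

The gating here merely stands in for whatever modulation the signature function provides; its role, as discussed in the session, is to condition the shared link predictor on graph-level structure so it can adapt from only a few observed edges.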
What are the key takeaways?
------
#AISC hosts 3-5 live sessions like this on various AI research, engineering, and product topics every week! Visit https://ai.science for more details