Structured Neural Summarization | AISC Lunch & Learn
Video Link: https://www.youtube.com/watch?v=K7eXuDsVe94
For more details, including the paper and slides, visit
https://aisc.a-i.science/events/2019-04-16/
Abstract
Summarization of long sequences into a concise statement is a core problem in natural language processing, requiring non-trivial understanding of the input. Based on the promising results of graph neural networks on highly structured data, we develop a framework to extend existing sequence encoders with a graph component that can reason about long-distance relationships in weakly structured data such as text. In an extensive evaluation, we show that the resulting hybrid sequence-graph models outperform both pure sequence models and pure graph models on a range of summarization tasks.
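To make the hybrid idea concrete, here is a minimal sketch (not the authors' code) of a sequence encoder extended with a graph component, assuming PyTorch. Per-token states from a bidirectional GRU seed the graph's node states, and a few rounds of gated (GRU-style) message passing in the spirit of a GGNN propagate information along long-distance edges. Names such as HybridSequenceGraphEncoder, num_steps, and edge_index are illustrative assumptions, not the paper's API.

```python
# Sketch only: a sequence encoder whose token states are refined by
# gated message passing over extra graph edges, as described in the abstract.
import torch
import torch.nn as nn


class HybridSequenceGraphEncoder(nn.Module):
    def __init__(self, vocab_size: int, hidden: int = 128, num_steps: int = 4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        # Sequence component: bidirectional GRU over the token sequence.
        self.seq = nn.GRU(hidden, hidden // 2, bidirectional=True, batch_first=True)
        # Graph component: a message function plus a GRUCell that gates
        # how aggregated messages update each node state (GGNN-style).
        self.msg = nn.Linear(hidden, hidden)
        self.update = nn.GRUCell(hidden, hidden)
        self.num_steps = num_steps

    def forward(self, tokens: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        """tokens: (seq_len,) token ids; edge_index: (2, num_edges) src/dst pairs."""
        x = self.embed(tokens).unsqueeze(0)          # (1, seq_len, hidden)
        seq_states, _ = self.seq(x)                  # (1, seq_len, hidden)
        h = seq_states.squeeze(0)                    # node states start as token states

        src, dst = edge_index                        # edges encode long-distance structure
        for _ in range(self.num_steps):
            messages = self.msg(h)[src]              # one message per edge, from its source
            agg = torch.zeros_like(h).index_add_(0, dst, messages)  # sum messages per target
            h = self.update(agg, h)                  # gated update of node states
        return h                                     # hybrid per-token representations


# Tiny usage example: 6 tokens plus a few hypothetical long-distance edges.
encoder = HybridSequenceGraphEncoder(vocab_size=100)
tokens = torch.tensor([3, 17, 42, 8, 99, 5])
edges = torch.tensor([[0, 1, 2, 0, 3], [1, 2, 3, 5, 5]])  # (src, dst) pairs
out = encoder(tokens, edges)
print(out.shape)  # torch.Size([6, 128])
```

In the paper's setting, representations like these would feed a downstream decoder (e.g. a seq2seq summarizer or a method-name predictor); the sketch stops at the encoder.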
Tags:
deep learning
machine learning
learning representation
nlp
natural language processing
natural language summarization
graph neural network
gcn
ggnn
gated graph neural networks
hybrid graph sequence encoder
abstractive text summarization
method naming
seq2seq