[ELMo] Deep Contextualized Word Representations | AISC

Video Link: https://www.youtube.com/watch?v=9JfGxKkmBc0
Duration: 1:22:15


Toronto Deep Learning Series, 4 June 2018

For slides and more information, visit https://aisc.ai.science/events/2018-06-04/


Paper Review: https://arxiv.org/abs/1802.05365

Speaker: https://www.linkedin.com/in/chris-laver-81643a3b/
Organizer: https://www.linkedin.com/in/amirfz/

Host: http://www.rbc.com/futuremakers/

Paper abstract:
"We introduce a new type of deep contextualized word representation that models both (1) complex characteristics of word use (e.g., syntax and semantics), and (2) how these uses vary across linguistic contexts (i.e., to model polysemy). Our word vectors are learned functions of the internal states of a deep bidirectional language model (biLM), which is pre-trained on a large text corpus. We show that these representations can be easily added to existing models and significantly improve the state of the art across six challenging NLP problems, including question answering, textual entailment and sentiment analysis. We also present an analysis showing that exposing the deep internals of the pre-trained network is crucial, allowing downstream models to mix different types of semi-supervision signals."

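For context on the abstract: the representations it describes are built by collapsing the biLM's layer activations into one vector per token with softmax-normalized, task-specific scalar weights and a scaling factor (equation 1 of the paper). Below is a minimal NumPy sketch of that combination step; the names (elmo_embedding, layer_states, s, gamma) are illustrative, not taken from the authors' released code.

```python
import numpy as np

def elmo_embedding(layer_states, s, gamma):
    """Collapse L+1 biLM layers into one vector per token.

    layer_states: (L+1, seq_len, dim) array -- the token layer plus
                  L biLSTM layers, each a (seq_len, dim) matrix.
    s:            (L+1,) unnormalized task-specific scalar weights.
    gamma:        scalar that lets the task model rescale the vector.
    """
    w = np.exp(s - s.max())
    w = w / w.sum()  # softmax-normalized layer weights
    # Weighted sum over the layer axis: gamma * sum_j w_j * h_j
    return gamma * np.einsum('l,lsd->sd', w, layer_states)

# Toy usage: 3 layers (token + 2 biLSTM), 5 tokens, 1024-dim states.
states = np.random.randn(3, 5, 1024)
vectors = elmo_embedding(states, s=np.zeros(3), gamma=1.0)
print(vectors.shape)  # (5, 1024)
```

In the paper, the weights s and the scalar gamma are learned jointly with the downstream task model; here they are fixed inputs to keep the sketch self-contained.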

Tags:
deep learning
natural language processing
nlp
elmo
word embeddings
elmo nlp
elmo embedding