
BERT & NLP Explained
Speaker: Mitchell Wong, Data Scientist at Microsoft
Host: Abir Mokbel, Stream Owner at Aggregate Intellect
Motivation:
This video walks us through BERT, an open-source machine learning framework for natural language processing (NLP). BERT helps computers resolve the meaning of ambiguous language in text by using the surrounding words, on both sides of a token, to establish context. The video covers NLP, language models, how BERT works, its pre-training tasks (masked language modeling and next sentence prediction), real-life applications, and limitations and future enhancements.
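The core intuition behind masked language modeling can be sketched in a few lines of plain Python: hide a token and guess it from both its left and right neighbours. This is a toy illustration only, not BERT itself; the tiny corpus, the `predict_masked` helper, and the neighbour-counting heuristic are all made up for this sketch, whereas real BERT learns from billions of words with a Transformer, not from exact-match counts.

```python
from collections import Counter

# Tiny stand-in corpus (purely illustrative; BERT pre-trains on
# billions of words from Wikipedia and BooksCorpus).
corpus = (
    "the cat sat on the mat . "
    "the dog sat on the rug . "
    "the cat slept on the mat ."
).split()

def predict_masked(tokens, mask_index):
    """Guess a masked token from BOTH its left and right neighbours,
    mimicking BERT's bidirectional context (a left-to-right language
    model would only see the left side)."""
    left, right = tokens[mask_index - 1], tokens[mask_index + 1]
    # Count every corpus word that appears between the same neighbours.
    candidates = Counter(
        corpus[i]
        for i in range(1, len(corpus) - 1)
        if corpus[i - 1] == left and corpus[i + 1] == right
    )
    return candidates.most_common(1)[0][0]

sentence = "the cat sat on the [MASK] .".split()
print(predict_masked(sentence, sentence.index("[MASK]")))  # -> mat
```

Here "mat" wins over "rug" because it occurs more often between "the" and "." in the toy corpus; BERT replaces this brittle exact-match counting with learned contextual representations.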
Website: https://ai.science/?utm_source=youtube&utm_medium=abir&utm_campaign=aibrief
DG: https://ai.science/dg?utm_source=youtube&utm_medium=abir&utm_campaign=aibrief
Join slack: https://join.slack.com/t/aisc-to/shared_invite/zt-f5zq5l35-PSIJTFk4v60FML177PgsPg?utm_source=youtube&utm_medium=abir&utm_campaign=aibrief
Concepts & Graphic credits:
Next-word prediction: Holtzman et al., 2019
Markov: https://www.cs.princeton.edu/courses/archive/spring05/cos126/assignments/markov.html
Pre-training and fine-tuning: https://jalammar.github.io/illustrated-bert/
Self-attention: https://ai.googleblog.com/2017/08/transformer-novel-neural-network.html
Disclaimer from Host: "All opinions expressed in this video are my own"