
Why Transformers Work (Natural Language Processing)
Channel: Coding Tech
Subscribers: 725,000
Video Link: https://www.youtube.com/watch?v=QHkpGtDySqM
This will be a technical talk in which I'll explain the inner workings of the machine learning algorithms inside Rasa. In particular, I'll explain why the transformer has become part of many of our algorithms, replacing RNNs. These include use cases in natural language processing as well as dialogue handling.
You'll see a live demo of a typical error that an LSTM would make but a transformer wouldn't. The algorithms are explained with calm diagrams and very little maths.
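The talk itself sticks to diagrams, but the core contrast can be sketched in a few lines: an LSTM passes information step by step through a hidden state, while self-attention lets every token look at every other token directly. Below is a minimal NumPy sketch of (unparameterized) scaled dot-product self-attention; this is an illustration of the general mechanism, not code from the talk, and real transformers add learned query/key/value projections and multiple heads.

```python
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over a sequence of token vectors.

    Each output row is a weighted average of ALL input rows, so every
    token can attend to any other token in one step -- unlike an LSTM,
    which must carry information through the hidden state token by token.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)          # pairwise token similarities
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over each row
    return weights @ X, weights

# Toy "embeddings" for a 4-token sentence (hypothetical values).
X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [0.5, 0.5]])
out, w = self_attention(X)
# Each row of `w` is a probability distribution over which tokens to attend to.
```

Because attention is computed over the whole sequence at once, a distant-context error of the kind shown in the live demo (the LSTM forgetting information from early in the sentence) has no analogue here: the relevant token is always one attention step away.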
PUBLICATION PERMISSIONS:
Original video was published with the Creative Commons Attribution license (reuse allowed). Link: https://www.youtube.com/watch?v=cFUSjztXbL8
Other Videos By Coding Tech
2021-02-20 | Translating Chinese Into Morse Code!
2021-02-18 | 17 Things Developers Need to Know About Databases
2021-02-16 | WebAssembly: Digging a Bit Deeper
2021-02-08 | An introduction to WebAssembly
2021-02-07 | A Dev's Guide to CSS Grid
2021-02-07 | Hacking the Hybrid Cloud
2021-02-03 | Machine Learning for JavaScript Developers 101
2021-02-02 | New Performance Features: Make Your Pages Faster
2021-02-02 | VS Code and Data Science Tools
2021-02-01 | Blockchain Domains + IPFS = Decentralized Websites
2021-02-01 | Why Transformers Work (Natural Language Processing)
2021-02-01 | The Past, Present, and Future of CSS-in-JS
2021-01-29 | Parallel Processing in Python || Aaron Richter
2021-01-29 | Linux Memory Management at Scale
2021-01-27 | VS Code Productivity Tips and Tricks
2021-01-25 | .NET Productivity: Tips and Tricks
2021-01-24 | What Not to Do When Starting a New Project
2021-01-24 | The Future of Quantum Computing
2021-01-23 | What is Bitcoin and Blockchain (by Charles Hoskinson)
2021-01-23 | C++ As A Second Language
2021-01-19 | Mo'Problems, Mo'Nads
Tags:
machine learning
natural language processing
transformers