Principles of Riemannian Geometry in Neural Networks | TDLS

Published on 2018-08-14 ● Video Link: https://www.youtube.com/watch?v=IPrNIjA4AWE



Duration: 1:38:03
13,979 views


Toronto Deep Learning Series, 13 August 2018

For slides and more information, visit https://aisc.ai.science/events/2018-08-13/

Paper Review: https://papers.nips.cc/paper/6873-principles-of-riemannian-geometry-in-neural-networks.pdf

Speaker: https://www.linkedin.com/in/helen-ngo/
Organizer: https://www.linkedin.com/in/amirfz/

Host: Dessa

Paper abstract:
This paper interprets neural networks from a less common, geometric perspective, with the goal of formalizing a theoretical basis grounded in Riemannian geometry: a neural network is framed as learning an optimal coordinate representation of the underlying data manifold, one in which the different target classes become linearly separable by hyperplanes. It shows that residual neural networks are finite difference approximations to dynamical systems of first-order differential equations. Parallels are drawn between backpropagation and the pullback metric, a linear map between the spaces of 1-forms on two smooth manifolds, which acts on the coordinate representation of the metric tensor between network layers.
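
As a reading aid (not part of the talk materials), the ResNet-as-ODE claim can be made concrete with a minimal sketch: a residual block computing x_{l+1} = x_l + f(x_l) is exactly one explicit Euler step of the dynamical system dx/dt = f(x) with step size h = 1. The function names and the single tanh layer below are illustrative assumptions, not the paper's architecture.

    # Minimal sketch (assumed example, not the paper's code) of the
    # residual-block <-> explicit-Euler-step correspondence.
    import numpy as np

    def residual_block(x, W, b):
        # hypothetical one-layer residual map: x + tanh(W x + b)
        return x + np.tanh(W @ x + b)

    def euler_step(x, f, h=1.0):
        # explicit Euler update for dx/dt = f(x)
        return x + h * f(x)

    rng = np.random.default_rng(0)
    d = 4
    W, b = 0.1 * rng.standard_normal((d, d)), np.zeros(d)
    x = rng.standard_normal(d)

    # with step size h = 1 the two updates coincide exactly
    assert np.allclose(residual_block(x, W, b),
                       euler_step(x, lambda z: np.tanh(W @ z + b), h=1.0))

For the backpropagation parallel, the standard pullback-metric construction is: if phi is the map realized by a layer and h_{ab} is the metric in the output coordinates, the metric induced on the input coordinates is g_{ij} = (∂phi^a/∂x^i)(∂phi^b/∂x^j) h_{ab}, i.e. g = Jᵀ h J with J the layer Jacobian. The talk relates these accumulated Jacobian products to the ones formed during backpropagation; the notation here is standard differential geometry and may differ from the paper's.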




Other Videos By LLMs Explained - Aggregate Intellect - AI.SCIENCE


2018-10-25  [Original attention] Neural Machine Translation by Jointly Learning to Align and Translate | AISC
2018-10-16  [StackGAN++] Realistic Image Synthesis with Stacked Generative Adversarial Networks | AISC
2018-10-11  Bayesian Deep Learning on a Quantum Computer | TDLS Author Speaking
2018-10-02  Prediction of Cardiac arrest from physiological signals in the pediatric ICU | TDLS Author Speaking
2018-09-24  Junction Tree Variational Autoencoder for Molecular Graph Generation | TDLS
2018-09-19  Reconstructing quantum states with generative models | TDLS Author Speaking
2018-09-13  All-optical machine learning using diffractive deep neural networks | TDLS
2018-09-05  Recurrent Models of Visual Attention | TDLS
2018-08-28  Eve: A Gradient Based Optimization Method with Locally and Globally Adaptive Learning Rates | TDLS
2018-08-20  TDLS: Large-Scale Unsupervised Deep Representation Learning for Brain Structure
2018-08-14  Principles of Riemannian Geometry in Neural Networks | TDLS
2018-08-07  Abstractive Text Summarization Using Sequence-to-Sequence RNNs and Beyond | TDLS
2018-07-30  Program Language Translation Using a Grammar-Driven Tree-to-Tree Model | TDLS
2018-07-23  Explainable Neural Networks based on Additive Index Models | TDLS
2018-07-18  TMLS2018 - Machine Learning in Production, Panel Discussion
2018-07-16  Flexible Neural Representation for Physics Prediction | AISC Trending Paper
2018-07-10  Connectionist Temporal Classification, Labelling Unsegmented Sequence Data with RNN | TDLS
2018-06-25  Learning to Represent Programs with Graphs | TDLS
2018-06-19  Quantum generative adversarial networks | TDLS Author Speaking
2018-06-12  [SAGAN] Self-Attention Generative Adversarial Networks | TDLS
2018-06-05  [ELMo] Deep Contextualized Word Representations | AISC



Tags:
deep learning
riemannian geometry
neural network
representation learning
riemann geometry
riemannian manifold
manifold learning