Neural Ordinary Differential Equations - part 2 (results & discussion) | AISC

Published on 2019-02-04 ● Video Link: https://www.youtube.com/watch?v=2pP0Puj15Nc



Category:
Discussion
Duration: 42:05


Toronto Deep Learning Series, 14-Jan-2019
https://tdls.a-i.science/events/2019-01-14

Paper: https://arxiv.org/abs/1806.07366

Discussion Panel: Jodie Zhu, Helen Ngo, Lindsay Brin

Host: SAS Institute Canada

NEURAL ORDINARY DIFFERENTIAL EQUATIONS

We introduce a new family of deep neural network models. Instead of specifying a discrete sequence of hidden layers, we parameterize the derivative of the hidden state using a neural network. The output of the network is computed using a black-box differential equation solver. These continuous-depth models have constant memory cost, adapt their evaluation strategy to each input, and can explicitly trade numerical precision for speed. We demonstrate these properties in continuous-depth residual networks and continuous-time latent variable models. We also construct continuous normalizing flows, a generative model that can train by maximum likelihood, without partitioning or ordering the data dimensions. For training, we show how to scalably backpropagate through any ODE solver, without access to its internal operations. This allows end-to-end training of ODEs within larger models.
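The core idea above — replace a stack of discrete layers with a learned derivative dh/dt = f(h, t) and let an ODE solver produce the output — can be sketched in a few lines. This is a minimal, self-contained illustration, not the paper's implementation (which uses adaptive solvers and the adjoint method for backprop): the "network" f is a toy tanh unit with hand-picked weights, and the solver is a fixed-step RK4 integrator.

```python
import math

# Toy stand-in for a neural network parameterizing the hidden-state
# derivative dh/dt = f(h, t). Weight 0.5 is a hypothetical value.
def f(h, t):
    return [math.tanh(0.5 * x) for x in h]

def rk4_step(f, h, t, dt):
    # One classical Runge-Kutta 4 step on the state vector h.
    k1 = f(h, t)
    k2 = f([x + 0.5 * dt * k for x, k in zip(h, k1)], t + 0.5 * dt)
    k3 = f([x + 0.5 * dt * k for x, k in zip(h, k2)], t + 0.5 * dt)
    k4 = f([x + dt * k for x, k in zip(h, k3)], t + dt)
    return [x + dt / 6 * (a + 2 * b + 2 * c + d)
            for x, a, b, c, d in zip(h, k1, k2, k3, k4)]

def odeint(f, h0, t0, t1, steps=100):
    # Integrate h'(t) = f(h, t) from t0 to t1; the "depth" of the
    # network is now the integration interval, not a layer count.
    h, t = list(h0), t0
    dt = (t1 - t0) / steps
    for _ in range(steps):
        h = rk4_step(f, h, t, dt)
        t += dt
    return h

# "Forward pass": evolve the input state h(0) to the output h(1).
h1 = odeint(f, [1.0, -2.0], 0.0, 1.0)
```

Note how this mirrors a residual network: a ResNet block computes h_{t+1} = h_t + f(h_t), which is exactly one Euler step of the ODE above with dt = 1; the continuous model lets the solver choose its own step count, trading precision for speed as the abstract describes.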




Other Videos By LLMs Explained - Aggregate Intellect - AI.SCIENCE


2019-02-19Computational prediction of diagnosis & feature selection on mesothelioma patient records | AISC
2019-02-18Support Vector Machine (original paper) | AISC Foundational
2019-02-11Tensor Field Networks | AISC
2019-02-07ACAI: Understanding and Improving Interpolation in Autoencoders via an Adversarial Regularizer
2019-02-04Code Review: Transformer - Attention Is All You Need | AISC
2019-02-04[StyleGAN] A Style-Based Generator Architecture for GANs, part2 (results and discussion) | TDLS
2019-02-04[StyleGAN] A Style-Based Generator Architecture for GANs, part 1 (algorithm review) | TDLS
2019-02-04TDLS: Learning Functional Causal Models with GANs - part 1 (algorithm review)
2019-02-04TDLS: Learning Functional Causal Models with GANs - part 2 (results and discussion)
2019-02-04Neural Ordinary Differential Equations - part 1 (algorithm review) | AISC
2019-02-04Neural Ordinary Differential Equations - part 2 (results & discussion) | AISC
2019-02-04Parallel Collaborative Filtering for the Netflix Prize (algorithm review) | AISC Foundational
2019-02-04Parallel Collaborative Filtering for the Netflix Prize (results & discussion) AISC Foundational
2019-01-14TDLS - Announcing Fast Track Stream
2019-01-09Extracting Biologically Relevant Latent Space from Cancer Transcriptomes w/ VAEs (discussions) | AISC
2019-01-09Extracting Biologically Relevant Latent Space from Cancer Transcriptomes w/ VAEs (algorithm) | AISC
2019-01-08[original backprop paper] Learning representations by back-propagating errors (part1) | AISC
2019-01-08[original backprop paper] Learning representations by back-propagating errors (part2) | AISC
2018-12-16Automated Deep Learning: Joint Neural Architecture and Hyperparameter Search (discussions) | AISC
2018-12-16Automated Deep Learning: Joint Neural Architecture and Hyperparameter Search (algorithm) | AISC
2018-12-09Automated Vulnerability Detection in Source Code Using Deep Learning (discussions) | AISC



Tags:
neural nets
residual nets
differential equations
deep learning
neural ordinary differential equations
neural differential equations
neural ode
ODEnet
ODE net
ordinary differential equations