Junction Tree Variational Autoencoder for Molecular Graph Generation | TDLS

Video Link: https://www.youtube.com/watch?v=QFRv_lOWeKI



Duration: 1:23:23
4,606 views


Toronto Deep Learning Series, 24 September 2018

For slides and more information, visit https://tdls.a-i.science/events/2018-09-24/

Paper Review: https://arxiv.org/abs/1802.04364

Speaker: Rouzbeh Afrasiabi (Multiple Sclerosis Society of Canada)
Organizer: https://www.linkedin.com/in/amirfz/

Host: Microsoft Canada
Date: Sep 24th, 2018

Junction Tree Variational Autoencoder for Molecular Graph Generation

We seek to automate the design of molecules based on specific chemical properties. In computational terms, this task involves continuous embedding and generation of molecular graphs. Our primary contribution is the direct realization of molecular graphs, a task previously approached by generating linear SMILES strings instead of graphs. Our junction tree variational autoencoder generates molecular graphs in two phases, by first generating a tree-structured scaffold over chemical substructures, and then combining them into a molecule with a graph message passing network. This approach allows us to incrementally expand molecules while maintaining chemical validity at every step. We evaluate our model on multiple tasks ranging from molecular generation to optimization. Across these tasks, our model outperforms previous state-of-the-art baselines by a significant margin.
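As context for the two-phase approach described above, here is a minimal Python sketch of the kind of cluster decomposition that underlies the junction tree: a molecule is broken into clusters (rings and non-ring bonds), and clusters that share atoms become nodes connected in a tree-like graph. This is a simplified illustration using RDKit, not the authors' implementation; the paper's tree decomposition additionally merges rings that share more than two atoms and treats atoms shared by three or more clusters specially.

```python
# Simplified junction-tree-style decomposition of a molecule (illustrative sketch).
# Assumes RDKit is installed; the SMILES string below is just an example input.
from itertools import combinations
from rdkit import Chem


def junction_tree_clusters(smiles):
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError(f"Could not parse SMILES: {smiles}")

    # Clusters: each ring (as a set of atom indices) plus each non-ring bond.
    clusters = [set(ring) for ring in Chem.GetSymmSSSR(mol)]
    clusters += [
        {b.GetBeginAtomIdx(), b.GetEndAtomIdx()}
        for b in mol.GetBonds()
        if not b.IsInRing()
    ]

    # Tree edges: connect any two clusters that share at least one atom.
    edges = [
        (i, j)
        for (i, ci), (j, cj) in combinations(enumerate(clusters), 2)
        if ci & cj
    ]
    return clusters, edges


if __name__ == "__main__":
    clusters, edges = junction_tree_clusters("c1ccccc1CCO")  # 2-phenylethanol
    print("clusters:", clusters)
    print("tree edges:", edges)
```

In the model itself, the decoder runs this picture in reverse: it first generates such a tree of substructures from the latent code, then a graph message passing network resolves how the substructures are attached to form a chemically valid molecule.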




Other Videos By LLMs Explained - Aggregate Intellect - AI.SCIENCE


2018-11-27 Neural Image Caption Generation with Visual Attention (discussion) | AISC
2018-11-17 PGGAN | Progressive Growing of GANs for Improved Quality, Stability, and Variation (part 2) | AISC
2018-11-16 PGGAN | Progressive Growing of GANs for Improved Quality, Stability, and Variation (part 1) | AISC
2018-11-16 (Original Paper) Latent Dirichlet Allocation (discussions) | AISC Foundational
2018-11-15 (Original Paper) Latent Dirichlet Allocation (algorithm) | AISC Foundational
2018-10-31 [Transformer] Attention Is All You Need | AISC Foundational
2018-10-25 [Original attention] Neural Machine Translation by Jointly Learning to Align and Translate | AISC
2018-10-16 [StackGAN++] Realistic Image Synthesis with Stacked Generative Adversarial Networks | AISC
2018-10-11 Bayesian Deep Learning on a Quantum Computer | TDLS Author Speaking
2018-10-02 Prediction of Cardiac arrest from physiological signals in the pediatric ICU | TDLS Author Speaking
2018-09-24 Junction Tree Variational Autoencoder for Molecular Graph Generation | TDLS
2018-09-19 Reconstructing quantum states with generative models | TDLS Author Speaking
2018-09-13 All-optical machine learning using diffractive deep neural networks | TDLS
2018-09-05 Recurrent Models of Visual Attention | TDLS
2018-08-28 Eve: A Gradient Based Optimization Method with Locally and Globally Adaptive Learning Rates | TDLS
2018-08-20 TDLS: Large-Scale Unsupervised Deep Representation Learning for Brain Structure
2018-08-14 Principles of Riemannian Geometry in Neural Networks | TDLS
2018-08-07 Abstractive Text Summarization Using Sequence-to-Sequence RNNs and Beyond | TDLS
2018-07-30 Program Language Translation Using a Grammar-Driven Tree-to-Tree Model | TDLS
2018-07-23 Explainable Neural Networks based on Additive Index Models | TDLS
2018-07-18 TMLS2018 - Machine Learning in Production, Panel Discussion



Tags:
Science & Technology
Deep Learning
Chemistry
Variational Autoencoder
Molecular Graph Generation
neural chemistry
junction tree