Furong Huang: Discovery of Latent Factors in High-dimensional Data Using Tensor Methods

Channel: QuICS
Subscribers: 2,520
Published on: 2018-10-31
Video Link: https://www.youtube.com/watch?v=4Zuskr4ySqI
Duration: 49:15
Views: 534

A talk by Furong Huang at the Quantum Machine Learning Workshop, hosted September 24-28, 2018 by the Joint Center for Quantum Information and Computer Science at the University of Maryland (QuICS).

Abstract: Latent or hidden variable models have applications in almost every domain, e.g., social network analysis, natural language processing, computer vision, and computational biology. Training latent variable models is challenging due to the non-convexity of the likelihood objective function. An alternative method is based on the spectral decomposition of low-order moment matrices and tensors. This versatile framework is guaranteed to estimate the correct model consistently. I will discuss my results on convergence to the globally optimal solution for stochastic gradient descent, despite the non-convexity of the objective. I will then discuss large-scale implementations of spectral methods (which are highly parallel and scalable), carried out on CPU/GPU and Spark platforms. We obtain gains of several orders of magnitude in both accuracy and running time compared to state-of-the-art variational methods. I will discuss the following applications in detail: (1) learning hidden user commonalities (communities) in social networks, and (2) learning sentence embeddings for paraphrase detection using convolutional models.
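
To make the moment-based approach concrete, the sketch below (not code from the talk; the function name and the synthetic setup are illustrative) implements the basic tensor power method with deflation, which recovers the components of a symmetric, orthogonally decomposable third-order tensor of the kind these spectral methods estimate:

# Minimal sketch of the tensor power method (assumption: an orthogonally
# decomposable symmetric tensor T = sum_r w_r * a_r (x) a_r (x) a_r).
import numpy as np

def tensor_power_method(T, n_components, n_restarts=10, n_iters=100, seed=0):
    """Recover (weight, vector) pairs from a symmetric odeco tensor T of shape (d, d, d)."""
    rng = np.random.default_rng(seed)
    d = T.shape[0]
    T = T.copy()
    weights, vectors = [], []
    for _ in range(n_components):
        best_v, best_lam = None, -np.inf
        for _ in range(n_restarts):
            v = rng.standard_normal(d)
            v /= np.linalg.norm(v)
            for _ in range(n_iters):
                # Power update: v <- T(I, v, v), then normalize.
                v = np.einsum('ijk,j,k->i', T, v, v)
                v /= np.linalg.norm(v)
            lam = np.einsum('ijk,i,j,k->', T, v, v, v)
            if lam > best_lam:
                best_lam, best_v = lam, v
        weights.append(best_lam)
        vectors.append(best_v)
        # Deflate: subtract the recovered rank-one component before the next round.
        T = T - best_lam * np.einsum('i,j,k->ijk', best_v, best_v, best_v)
    return np.array(weights), np.array(vectors)

# Usage: build a synthetic odeco tensor from orthonormal components and recover them.
rng = np.random.default_rng(1)
A, _ = np.linalg.qr(rng.standard_normal((8, 3)))      # orthonormal columns a_1..a_3
w = np.array([3.0, 2.0, 1.0])
T = np.einsum('r,ir,jr,kr->ijk', w, A, A, A)          # T = sum_r w_r a_r (x) a_r (x) a_r
w_hat, A_hat = tensor_power_method(T, n_components=3)
# w_hat approximates [3, 2, 1] and rows of A_hat match the corresponding columns of A (up to ordering).

In the standard pipeline (e.g., Anandkumar et al.), the empirical moment tensor is first whitened so that its components become approximately orthogonal, after which this power iteration applies; the large-scale CPU/GPU and Spark implementations mentioned in the abstract parallelize such multilinear contractions.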




Other Videos By QuICS


2019-03-29 Serge Fehr: Security of the Fiat-Shamir Transformation in the Quantum Random Oracle Model
2018-10-31 Mario Szegedy: A New Algorithm for Product Decomposition in Quantum Signal Processing
2018-10-31 Scott Aaronson: Gentle Measurement of Quantum States and Differential Privacy
2018-10-31 Seth Lloyd: Quantum Generative Adversarial Networks
2018-10-31 Norbert Linke: Quantum Machine Learning with Trapped Ions
2018-10-31 Kristan Temme: Supervised Learning with Quantum Enhanced Feature Spaces
2018-10-31 Soheil Feizi: Generative Adversarial Networks: Formulation, Design and Computation
2018-10-31 Nathan Wiebe: Optimizing Quantum Optimization Algorithms via Faster Quantum Gradient Computation
2018-10-31 Rolando Somma: Quantum Algorithms for Systems of Linear Equations
2018-10-31 Anupam Prakash: A Quantum Interior Point Method for LPs and SDPs
2018-10-31 Furong Huang: Discovery of Latent Factors in High-dimensional Data Using Tensor Methods
2018-10-31 Fernando Brandao: Quantum Speed-up for SDPs and Kernel Learning
2018-10-31 Srinivasan Arunachalam: Strengths and weaknesses of quantum examples for learning
2018-10-31 Vedran Dunjko: A Route towards Quantum-Enhanced Artificial Intelligence
2018-10-31 Elad Hazan: Efficient Optimization for Machine Learning: Beyond Stochastic Gradient Descent
2017-10-11 John Preskill: QEC in 2017—Past, present, and future
2017-10-11 Sepehr Nezami: Quantum Error Correction of Reference Frame Information
2017-10-11 Anirudh Krishna: Performance of hyperbolic surface codes
2017-10-11 Brian Swingle: Entanglement, Wormholes, and Quantum Error Correction
2017-10-11 Christa Flühmann: Preparation of Grid state qubits by sequential modular position measurements
2017-10-11 Matteo Marinelli: Repetitive stabilizer readout with conditional feedback



Tags:
machine learning