NIPS 2011 Sparse Representation & Low-rank Approximation Workshop: Local Analysis...

Video Link: https://www.youtube.com/watch?v=lxQBswrFjtA
Duration: 30:11

Sparse Representation and Low-rank Approximation Workshop at NIPS 2011
Invited Talk: Local Analysis of Sparse Coding in the Presence of Noise by Rodolphe Jenatton, INRIA / Ecole Normale Supérieure

Abstract: A popular approach within the signal processing and machine learning communities consists in modelling signals as sparse linear combinations of atoms selected from a learned dictionary. While this paradigm has led to numerous empirical successes in fields ranging from image to audio processing, only a few theoretical arguments support this empirical evidence. In particular, sparse coding, or sparse dictionary learning, relies on a non-convex procedure whose local minima have not yet been fully analyzed. In this paper, we consider a probabilistic model of sparse signals and show that, with high probability, sparse coding admits a local minimum around the reference dictionary generating the signals. Our study takes into account the case of overcomplete dictionaries and noisy signals, thus extending previous work limited to noiseless settings and/or undercomplete dictionaries. The analysis we conduct is non-asymptotic and makes it possible to understand how the key quantities of the problem, such as the coherence or the noise level, are allowed to scale with respect to the dimension of the signals, the number of atoms, the sparsity, and the number of observations.
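The setting the abstract describes can be sketched concretely: noisy signals generated as sparse combinations of atoms from a reference (overcomplete) dictionary, and the standard l1-regularized sparse coding objective evaluated at that reference dictionary. This is a minimal illustrative sketch, not the paper's analysis; all dimensions, the noise level, and the regularization weight `lam` are assumed values chosen for illustration.

```python
import numpy as np

# Sparse coding objective (per-sample average), for signals x_i, dictionary D
# (columns = unit-norm atoms), and codes a_i:
#   (1/N) * sum_i [ 0.5 * ||x_i - D a_i||_2^2 + lam * ||a_i||_1 ]
# We generate k-sparse codes from a reference dictionary, add noise, and
# evaluate the objective at the reference dictionary, mirroring the
# probabilistic model in the abstract. Parameter values are illustrative.

rng = np.random.default_rng(0)
n, p, k, N = 20, 30, 3, 200   # signal dim, #atoms (p > n: overcomplete), sparsity, #samples
lam = 0.1                      # l1 regularization weight (assumed)

# Reference dictionary with unit-norm atoms
D = rng.standard_normal((n, p))
D /= np.linalg.norm(D, axis=0)

# k-sparse codes and noisy observations x = D a + noise
A = np.zeros((p, N))
for i in range(N):
    support = rng.choice(p, size=k, replace=False)
    A[support, i] = rng.standard_normal(k)
X = D @ A + 0.01 * rng.standard_normal((n, N))

def objective(D, A, X, lam):
    """Average sparse coding loss: data fit plus l1 penalty on the codes."""
    resid = X - D @ A
    return 0.5 * np.sum(resid ** 2) / N + lam * np.abs(A).sum() / N

print(objective(D, A, X, lam))
```

Since the observations were generated from `D` itself with small noise, the residual term of the objective is tiny at the reference dictionary; the paper's result concerns whether this reference point is (with high probability) near a local minimum of the non-convex joint objective in `(D, A)`.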



