Graphs, Games, Sparsity, and Signal Processing
G.3.2
Lecture
Compressed Sensing

Some Performance Guarantees of Global LASSO with Local Assumptions for Convolutional Sparse Design Matrices

Avishek Ghosh, Kannan Ramchandran

Abstract

We analyze the performance of the LASSO algorithm ($\ell_1$-regularized least squares; Tibshirani, '96) for a class of structured matrices dubbed convolutional sparse matrices. Analyzing such matrices is of paramount interest since, in many signal processing applications (including computer vision and image and audio processing), a global analysis of the underlying signal often entails understanding the behavior of a convolutional sparse matrix. We show that LASSO with such matrices succeeds under a constraint on local sparsity, as opposed to global sparsity. This conversion from a global to a local constraint is of crucial significance in the applications mentioned above. Under sufficiency conditions such as the Restricted Eigenvalue (RE) condition and the Exact Recovery Coefficient (ERC), we obtain prediction ($\ell_2$-norm) and estimation error rates for the LASSO estimator under local constraints. Furthermore, we obtain an estimation error rate for the LASSO estimator in the $\ell_{\infty}$ norm under a Gaussian noise model.
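To make the setting concrete, the following is a minimal sketch (not the authors' method) of LASSO recovery with a convolutional design matrix, whose columns are shifted copies of a short filter so that each observation depends only on a local window of the signal. The filter length, regularization weight, and the ISTA solver used here are illustrative assumptions, not details from the talk.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the l1 norm.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(A, y, lam, n_iter=500):
    # Solve min_x 0.5*||y - A x||_2^2 + lam*||x||_1 by iterative
    # soft-thresholding (ISTA); a simple stand-in for any LASSO solver.
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # inverse Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x + step * A.T @ (y - A @ x), step * lam)
    return x

rng = np.random.default_rng(0)

# Hypothetical convolutional design: column j holds a length-k filter h
# starting at row j, so the matrix has Toeplitz / convolution structure.
n, k = 64, 8                      # signal length, filter length (illustrative)
h = rng.standard_normal(k)
A = np.zeros((n + k - 1, n))
for j in range(n):
    A[j:j + k, j] = h

# Locally sparse ground truth: active entries are far apart, so every
# length-k window contains at most one of them.
x_true = np.zeros(n)
x_true[[5, 30, 55]] = [1.5, -2.0, 1.0]

y = A @ x_true + 0.01 * rng.standard_normal(n + k - 1)
x_hat = lasso_ista(A, y, lam=0.05)

print(np.linalg.norm(x_hat - x_true))  # estimation error in l2 norm
```

The well-separated support mimics the local-sparsity regime in the abstract: global sparsity is 3, but each window of width $k$ sees at most one nonzero.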


Presenters

Avishek Ghosh

University of California, Berkeley

Kannan Ramchandran

University of California, Berkeley

Session Chair

Sundeep Rangan

New York University