# Optimal Restricted Isometry Condition for Exact Sparse Recovery with Orthogonal Least Squares

Byonghyo Shim, Junhan Kim

Orthogonal least squares (OLS) is a classic algorithm for sparse recovery, function approximation, and subset selection. In this paper, we analyze the performance guarantee of the OLS algorithm. Specifically, we show that OLS guarantees the exact reconstruction of any $K$-sparse vector in $K$ iterations, provided that the sensing matrix has unit $\ell_{2}$-norm columns and satisfies the restricted isometry property (RIP) of order $K+1$ with \begin{align*} \delta_{K+1} &<C_{K} = \begin{cases} \frac{1}{\sqrt{K}}, & K=1, \\ \frac{1}{\sqrt{K+\frac{1}{4}}}, & K=2, \\ \frac{1}{\sqrt{K+\frac{1}{16}}}, & K=3, \\ \frac{1}{\sqrt{K}}, & K \ge 4. \end{cases} \end{align*} Furthermore, we show that the proposed guarantee is optimal in the sense that if $\delta_{K+1} \ge C_{K}$, then there exists a counterexample for which OLS fails to recover the $K$-sparse vector.
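For readers unfamiliar with the algorithm itself, the greedy selection rule of OLS can be sketched as follows: at each of the $K$ iterations, the column whose inclusion in the support minimizes the least-squares residual is added. This is a minimal NumPy illustration (the function name `ols` and the brute-force candidate search are our own choices, not notation from the paper); production implementations typically update the QR factorization incrementally instead of re-solving from scratch.

```python
import numpy as np

def ols(A, y, K):
    """Orthogonal least squares: greedily grow a support of size K.

    At each step, every candidate column j is tentatively added to the
    current support, a least-squares fit is computed, and the column
    giving the smallest residual norm is kept.
    """
    n = A.shape[1]
    support = []
    for _ in range(K):
        best_j, best_res = None, np.inf
        for j in range(n):
            if j in support:
                continue
            S = support + [j]
            # Least-squares coefficients on the enlarged support
            x_S, *_ = np.linalg.lstsq(A[:, S], y, rcond=None)
            res = np.linalg.norm(y - A[:, S] @ x_S)
            if res < best_res:
                best_res, best_j = res, j
        support.append(best_j)
    # Final estimate: least-squares solution restricted to the support
    x = np.zeros(n)
    x_S, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    x[support] = x_S
    return x, sorted(support)
```

On a well-conditioned random Gaussian sensing matrix with unit-norm columns (far more measurements than the sparsity level, so the RIP-type condition is comfortably met), this sketch recovers a $K$-sparse vector exactly in $K$ iterations.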


Session: Compressed Sensing
