to that of Theorem 1.9. We refer the reader to Chapter 8 for further details on the theoretical properties of thresholding algorithms, and focus here on OMP.

The simplest guarantees for OMP state that for exactly k-sparse x with noise-free measurements y = Ax, OMP will recover x exactly in k iterations. This analysis has been performed for both matrices satisfying the RIP [75] and matrices with bounded coherence [220]. In both results, however, the required constants are relatively small, so that the results only apply when m = O(k^2 log(n)).

There have been many efforts to improve upon these basic results. As one example, in [173] the required number of measurements is reduced to m = O(k^{1.6} log(n)) by allowing OMP to run for more than k iterations. More recently, it has been shown that this can be even further relaxed to the more familiar m = O(k log(n)) and that OMP is stable with respect to bounded noise, yielding a guarantee along the lines of Theorem 1.9 but only for exactly sparse signals [254]. Both of these analyses have exploited the RIP. There has also been recent progress in using the RIP to analyze the performance of OMP on non-sparse signals [10]. At present, however, RIP-based analysis of OMP remains a topic of ongoing work.

Note that all of the above efforts have aimed at establishing uniform guarantees (although often restricted to exactly sparse signals). In light of our discussion of probabilistic guarantees in Section 1.5.3, one might expect to see improvements by considering less restrictive guarantees. As an example, it has been shown that by considering random matrices for A, OMP can recover k-sparse signals in k iterations with high probability using only m = O(k log(n)) measurements [222]. Similar improvements are also possible by placing restrictions on the smallest nonzero value of the signal, as in [88]. Furthermore, such restrictions also enable near-optimal recovery guarantees when the measurements are corrupted by Gaussian noise [18].
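To make the k-iteration behavior discussed above concrete, the following is a minimal sketch of OMP in Python/NumPy. It is an illustrative implementation, not the specific variant analyzed in any of the cited works: at each iteration it selects the column of A most correlated with the current residual, then orthogonally projects y onto the span of the selected columns via least squares.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedy recovery of a k-sparse x from y = Ax.

    Runs exactly k iterations, matching the noise-free, exactly-sparse
    setting described in the text.
    """
    m, n = A.shape
    support = []
    residual = y.copy()
    x_hat = np.zeros(n)
    for _ in range(k):
        # Greedy selection: column of A most correlated with the residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Orthogonal projection: least-squares solve on the current support.
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x_hat = np.zeros(n)
        x_hat[support] = coeffs
        residual = y - A @ x_hat
    return x_hat
```

With a random Gaussian A and m on the order of k log(n), this procedure recovers an exactly k-sparse x with high probability, in line with the probabilistic guarantees of [222]; for deterministic matrices the uniform guarantees require the stronger conditions discussed above.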