# Sparse Recovery with Brownian Sensing

Alexandra Carpentier, Odalric-Ambrym Maillard, Rémi Munos.
In Proceedings of the 24th Conference on Advances in Neural Information Processing Systems, NIPS ’11, 2011.

Abstract: We consider the problem of recovering the parameter α ∈ R^K of a sparse function f (i.e. the number of non-zero entries of α is small compared to the number K of features) given noisy evaluations of f at a set of well-chosen sampling points. We introduce an additional randomization process, called Brownian sensing, based on the computation of stochastic integrals, which produces a Gaussian sensing matrix for which good recovery properties are proven, independently of the number of sampling points N, even when the features are arbitrarily non-orthogonal. Under the assumption that f is Hölder continuous with exponent at least 1/2, we provide an estimate â of the parameter such that ||α − â||_2 = O(||η||_2 / sqrt(N)), where η is the observation noise. The method uses a set of sampling points uniformly distributed along a one-dimensional curve selected according to the features. We report numerical experiments illustrating our method.
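As a rough illustration of the idea sketched in the abstract, the following toy Python snippet is a minimal sketch (not the paper's algorithm): it approximates the stochastic integrals ∫ φ_k dB_m and ∫ f dB_m by sums over Brownian increments at the sampling points, which yields a Gaussian-like sensing system A α ≈ b. The feature choice (cosines), the problem sizes, and the use of plain least squares for the final recovery step are all assumptions made here for brevity; the paper addresses the sparse-recovery setting (e.g. l1 minimization on the sensed system).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative assumptions, not from the paper): K features,
# an S-sparse coefficient vector alpha, and N noisy samples of f on [0, 1].
K, S, N, M = 20, 3, 200, 40                 # features, sparsity, samples, sensing paths
x = np.linspace(0.0, 1.0, N)                # sampling points along the curve
phi = np.array([np.cos(np.pi * k * x) for k in range(K)])  # (K, N) feature values
alpha = np.zeros(K)
alpha[rng.choice(K, size=S, replace=False)] = rng.normal(size=S)
y = alpha @ phi + 0.01 * rng.normal(size=N)  # noisy observations of f

# Brownian sensing: for each of M independent Brownian paths, approximate the
# stochastic integrals  b_m = ∫ f dB_m  and  A[m, k] = ∫ φ_k dB_m  by sums
# against the path's increments at the sampling points.
dB = rng.normal(scale=np.sqrt(1.0 / N), size=(M, N))  # Brownian increments
A = dB @ phi.T                              # (M, K) Gaussian sensing matrix
b = dB @ y                                  # (M,) sensed observations

# Final recovery step: plain least squares here for simplicity; the paper's
# setting calls for a sparse-recovery solver on A alpha ≈ b.
alpha_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print("max coefficient error:", np.abs(alpha_hat - alpha).max())
```

Note that the sensing matrix A has Gaussian entries by construction, regardless of how non-orthogonal the features φ_k are, which is the point of the randomization.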

You can download the paper from the NIPS website (here) or from the HAL online open-access repository (here).

Bibtex: @incollection{CarpentierMM11, title = {Sparse Recovery with Brownian Sensing}, author = {Alexandra Carpentier and Odalric-Ambrym Maillard and R\'{e}mi Munos}, booktitle = {Advances in Neural Information Processing Systems 24}, editor = {J. Shawe-Taylor and R.S. Zemel and P.L. Bartlett and F. Pereira and K.Q. Weinberger}, pages = {1782--1790}, year = {2011}, publisher = {Curran Associates, Inc.} }

# Compressed Least Squares Regression

[See Linear regression with random projections, 2012, for corrections.]

Odalric-Ambrym Maillard, Rémi Munos.
In NIPS ’09, pages 1213–1221, 2009.