Linear regression with random projections.

2012, Discussing articles

Odalric-Ambrym Maillard, Rémi Munos.
In Journal of Machine Learning Research, 2012, vol. 13, pp. 2735–2772.


Abstract:

We investigate a method for regression that makes use of a randomly generated subspace G_P (of finite dimension P) of a given large (possibly infinite) dimensional function space F, for example, L_2([0,1]^d). G_P is defined as the span of P random features that are linear combinations of basis functions of F weighted by i.i.d. Gaussian random coefficients. We give practical motivation for the use of this approach, detail the link that this random projection method shares with RKHS and Gaussian object theory, prove approximation error bounds, both in deterministic and random design, when searching for the best regression function in G_P rather than in F, and derive excess risk bounds for a specific regression algorithm (least-squares regression in G_P). This paper stresses the motivation for studying such methods; the analysis is therefore kept simple for expository purposes and leaves room for future developments.
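The construction is concrete enough to sketch in code: draw P random features as Gaussian-weighted combinations of basis functions of F, then run least-squares regression in their span. Below is a minimal Python sketch assuming a truncated cosine basis for F on [0,1] and synthetic data; the basis choice, scaling, and data are illustrative assumptions, not the paper's exact setup.

import numpy as np

rng = np.random.default_rng(0)

def cosine_basis(x, n_basis):
    # First n_basis cosine basis functions of L_2([0,1]), evaluated at x.
    return np.cos(np.pi * np.outer(x, np.arange(n_basis)))  # (len(x), n_basis)

# Synthetic regression data (illustrative, not from the paper).
N, n_basis, P = 200, 50, 10
x = rng.uniform(0.0, 1.0, size=N)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(N)

# P random features: i.i.d. Gaussian combinations of the basis functions,
# whose span is the random subspace G_P.
A = rng.standard_normal((P, n_basis)) / np.sqrt(n_basis)

# Least-squares regression in G_P (lstsq handles rank deficiency).
Phi = cosine_basis(x, n_basis) @ A.T            # (N, P) design matrix
beta, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# Prediction at new points.
x_test = np.linspace(0.0, 1.0, 5)
print(cosine_basis(x_test, n_basis) @ A.T @ beta)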

You can download the paper from the JMLR website (here) or from the HAL open access repository (here).

Bibtex:
@article{MaillardMunos12,
title = {{Linear Regression with Random Projections}},
author = {Maillard, Odalric-Ambrym and Munos, R{\'e}mi},
affiliation = {SEQUEL - INRIA Lille - Nord Europe},
pages = {2735--2772},
journal = {Journal of Machine Learning Research},
volume = {13},
year = {2012}
}
Related Publications:
Scrambled Objects for Least-Squares Regression.
Odalric-Ambrym Maillard, Rémi Munos.
In J. Lafferty, C. K. I. Williams, J. Shawe-Taylor, R. S. Zemel, and A. Culotta, editors, Proceedings of the 23rd Conference on Advances in Neural Information Processing Systems, NIPS '10, pages 1549–1557, 2010.

Compressed Least Squares Regression.
Odalric-Ambrym Maillard, Rémi Munos.
In Y. Bengio, D. Schuurmans, J. Lafferty, C. K. I. Williams, and A. Culotta, editors, Proceedings of the 22nd Conference on Advances in Neural Information Processing Systems, NIPS '09, pages 1213–1221, 2009.

Sparse recovery with Brownian sensing.

2011, Discussing articles

Alexandra Carpentier, Odalric-Ambrym Maillard, Rémi Munos.
In Proceedings of the 24th Conference on Advances in Neural Information Processing Systems, NIPS '11, 2011.


Abstract:

We consider the problem of recovering the parameter α ∈ R^K of a sparse function f (i.e., the number of non-zero entries of α is small compared to the number K of features) given noisy evaluations of f at a set of well-chosen sampling points. We introduce an additional randomization process, called Brownian sensing, based on the computation of stochastic integrals, which produces a Gaussian sensing matrix for which good recovery properties are proven independently of the number N of sampling points, even when the features are arbitrarily non-orthogonal. Under the assumption that f is Hölder continuous with exponent at least 1/2, we provide an estimate â of the parameter such that ||α − â||_2 = O(||η||_2/√N), where η is the observation noise. The method uses a set of sampling points uniformly distributed along a one-dimensional curve selected according to the features. We report numerical experiments illustrating our method.
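The procedure lends itself to a short sketch: integrate both the noisy evaluations and the features against M independent Brownian motions to build the Gaussian sensing matrix, then solve a standard l1-regularized least-squares problem. The Python sketch below assumes cosine features on [0,1], Riemann-sum approximations of the stochastic integrals, and a plain ISTA solver; these are illustrative stand-ins, not the paper's exact construction or recovery algorithm.

import numpy as np

rng = np.random.default_rng(1)

K, M, N = 30, 12, 400            # features, Brownian motions, sampling points
t = np.linspace(0.0, 1.0, N)     # uniform sampling points along the curve

# Cosine features g_k on [0,1] (an assumed choice); G has shape (K, N).
G = np.cos(np.pi * np.outer(np.arange(K), t))

# Sparse parameter alpha and noisy evaluations of f = sum_k alpha_k g_k.
alpha = np.zeros(K)
alpha[[2, 7, 19]] = [1.0, -0.5, 0.8]
y = alpha @ G + 0.01 * rng.standard_normal(N)

# Increments dB_m(t_n) of M independent Brownian motions.
dB = rng.standard_normal((M, N)) * np.sqrt(1.0 / N)

# Stochastic integrals approximated by Riemann sums: the sensing matrix
# V[m, k] ~ int g_k dB_m is Gaussian, and the measurements b[m] ~ int f dB_m.
V = dB @ G.T                     # shape (M, K)
b = dB @ y                       # shape (M,)

def ista(V, b, lam=1e-3, n_iter=2000):
    # Plain ISTA for min_a 0.5*||V a - b||^2 + lam*||a||_1.
    L = np.linalg.norm(V, 2) ** 2        # Lipschitz constant of the gradient
    a = np.zeros(V.shape[1])
    for _ in range(n_iter):
        z = a - V.T @ (V @ a - b) / L
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return a

a_hat = ista(V, b)
print(np.round(a_hat[[2, 7, 19]], 2))    # roughly recovers 1.0, -0.5, 0.8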

You can download the paper from the NIPS website (here) or from the HAL open access repository (here).

Bibtex:
@incollection{CarpentierMM11,
title = {Sparse Recovery with Brownian Sensing},
author = {Alexandra Carpentier and Odalric-Ambrym Maillard and R\'{e}mi Munos},
booktitle = {Advances in Neural Information Processing Systems 24},
editor = {J. Shawe-Taylor and R.S. Zemel and P.L. Bartlett and F. Pereira and K.Q. Weinberger},
pages = {1782--1790},
year = {2011},
publisher = {Curran Associates, Inc.}
}