Show simple item record

dc.contributor.advisor: Tomaso Poggio
dc.contributor.author: Mosci, Sofia (en_US)
dc.contributor.author: Rosasco, Lorenzo (en_US)
dc.contributor.author: Santoro, Matteo (en_US)
dc.contributor.author: Verri, Alessandro (en_US)
dc.contributor.author: Villa, Silvia (en_US)
dc.contributor.other: Center for Biological and Computational Learning (CBCL) (en_US)
dc.date.accessioned: 2011-09-26T20:45:09Z
dc.date.available: 2011-09-26T20:45:09Z
dc.date.issued: 2011-09-26
dc.identifier.uri: http://hdl.handle.net/1721.1/65964
dc.description.abstract: In this work we are interested in the problems of supervised learning and variable selection when the input-output dependence is described by a nonlinear function depending on a few variables. Our goal is to consider a sparse nonparametric model, hence avoiding linear or additive models. The key idea is to measure the importance of each variable in the model by making use of partial derivatives. Based on this intuition we propose and study a new regularizer and a corresponding least squares regularization scheme. Using concepts and results from the theory of reproducing kernel Hilbert spaces and proximal methods, we show that the proposed learning algorithm corresponds to a minimization problem which can be provably solved by an iterative procedure. The consistency properties of the obtained estimator are studied both in terms of prediction and selection performance. An extensive empirical analysis shows that the proposed method performs favorably with respect to the state-of-the-art. (en_US)
dc.format.extent: 38 p. (en_US)
dc.relation.ispartofseries: MIT-CSAIL-TR-2011-041
dc.relation.ispartofseries: CBCL-303
dc.subject: computational learning (en_US)
dc.subject: machine learning (en_US)
dc.title: Nonparametric Sparsity and Regularization (en_US)
dc.language.rfc3066: en-US
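The abstract notes that the proposed regularization scheme leads to a minimization problem solvable by an iterative proximal procedure. The report's derivative-based regularizer is more involved, but the flavor of such a proximal (forward-backward) iteration can be sketched on the simpler sparsity-penalized least-squares problem. This is a generic illustration, not the report's algorithm; all function names and parameters below are assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_iteration(X, y, lam, n_iter=500):
    """Forward-backward splitting for
    min_w  (1/2n) ||Xw - y||^2 + lam * ||w||_1,
    a simplified stand-in for the paper's regularized problem."""
    n, d = X.shape
    # Step size 1/L, where L bounds the Lipschitz constant of the gradient.
    L = np.linalg.norm(X, 2) ** 2 / n
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n      # gradient of the smooth term
        w = soft_threshold(w - grad / L, lam / L)  # proximal step
    return w

# Toy example: only the first two of five variables are relevant.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
y = 2 * X[:, 0] - 3 * X[:, 1]
w = proximal_iteration(X, y, lam=0.1)
```

The iteration alternates a gradient step on the data-fit term with a shrinkage (proximal) step on the penalty; irrelevant coordinates of `w` are driven to exactly zero, which is the variable-selection behavior the abstract describes.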


