Show simple item record

dc.contributor.advisor: Tomaso Poggio
dc.contributor.author: Rosasco, Lorenzo
dc.contributor.author: Verri, Alessandro
dc.contributor.author: Santoro, Matteo
dc.contributor.author: Mosci, Sofia
dc.contributor.author: Villa, Silvia
dc.contributor.other: Center for Biological and Computational Learning (CBCL)
dc.date.accessioned: 2009-10-14T21:00:10Z
dc.date.available: 2009-10-14T21:00:10Z
dc.date.issued: 2009-10-14
dc.identifier.uri: http://hdl.handle.net/1721.1/49428
dc.description.abstract: In this paper we propose a general framework to characterize and solve the optimization problems underlying a large class of sparsity-based regularization algorithms. More precisely, we study the minimization of learning functionals that are sums of a differentiable data term and a convex nondifferentiable penalty. The latter penalties have recently become popular in machine learning because they make it possible to enforce various kinds of sparsity properties in the solution. Leveraging the theory of Fenchel duality and subdifferential calculus, we derive explicit optimality conditions for the regularized solution and propose a general iterative projection algorithm whose convergence to the optimal solution can be proved. The generality of the framework is illustrated by considering several examples of regularization schemes, including l1 regularization (and several variants), multiple kernel learning, and multi-task learning. Finally, some features of the proposed framework are studied empirically.
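For the plain l1 case mentioned in the abstract, the iterative projection scheme the report describes reduces to iterative soft-thresholding (ISTA), where each step alternates a gradient step on the differentiable data term with the proximity operator of the penalty. The sketch below is illustrative only, not the report's algorithm: the function names, fixed step size, and iteration count are assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximity operator of t * ||.||_1: componentwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(X, y, lam, n_iter=1000):
    """Minimize 0.5 * ||X w - y||^2 + lam * ||w||_1 by iterative soft-thresholding."""
    n, d = X.shape
    # Step size 1/L, where L = ||X||_2^2 is the Lipschitz constant of the gradient.
    step = 1.0 / np.linalg.norm(X, 2) ** 2
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)          # gradient of the data term
        w = soft_threshold(w - step * grad, step * lam)  # projection/prox step
    return w
```

With a small regularization parameter `lam` and noiseless data, the iterates converge to a sparse vector whose support matches the generating coefficients.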
dc.format.extent: 28 p.
dc.relation.ispartofseries: MIT-CSAIL-TR-2009-050
dc.relation.ispartofseries: CBCL-282
dc.subject: computation
dc.subject: learning
dc.title: Iterative Projection Methods for Structured Sparsity Regularization
