
dc.contributor.author: Vito, Ernesto De
dc.contributor.author: Caponnetto, Andrea
dc.date.accessioned: 2005-12-22T02:28:54Z
dc.date.available: 2005-12-22T02:28:54Z
dc.date.issued: 2005-05-16
dc.identifier.other: MIT-CSAIL-TR-2005-031
dc.identifier.other: AIM-2005-015
dc.identifier.other: CBCL-249
dc.identifier.uri: http://hdl.handle.net/1721.1/30543
dc.description.abstract: We show that recent results in [3] on risk bounds for regularized least-squares on reproducing kernel Hilbert spaces can be straightforwardly extended to the vector-valued regression setting. We first briefly introduce central concepts on operator-valued kernels. Then we show how risk bounds can be expressed in terms of a generalization of effective dimension.
dc.format.extent: 17 p.
dc.format.extent: 12090406 bytes
dc.format.extent: 642646 bytes
dc.format.mimetype: application/postscript
dc.format.mimetype: application/pdf
dc.language.iso: en_US
dc.relation.ispartofseries: Massachusetts Institute of Technology Computer Science and Artificial Intelligence Laboratory
dc.subject: AI
dc.subject: optimal rates
dc.subject: reproducing kernel Hilbert space
dc.subject: effective dimension
dc.title: Risk Bounds for Regularized Least-squares Algorithm with Operator-valued kernels
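The abstract refers to the regularized least-squares algorithm in the vector-valued setting. As a minimal sketch (an illustration assumed here, not code from the report): for the simplest operator-valued kernel, K(x, x') = k(x, x')·I with k a scalar Gaussian kernel, the estimator solves (K + nλI)C = Y and the vector-valued problem decouples into one scalar kernel ridge regression per output coordinate. All function names and parameter values below are hypothetical choices for the example.

```python
import numpy as np

def gaussian_kernel(X1, X2, sigma=1.0):
    """Scalar Gaussian kernel matrix between rows of X1 and X2."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def fit_rls(X, Y, lam=1e-2, sigma=1.0):
    """Vector-valued regularized least-squares with K(x,x') = k(x,x')I.

    Solves (K + n*lam*I) C = Y; each column of C is the coefficient
    vector of an independent scalar kernel ridge regression.
    """
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    C = np.linalg.solve(K + n * lam * np.eye(n), Y)  # shape (n, output_dim)
    return C

def predict_rls(X_train, C, X_new, sigma=1.0):
    """Predict f(x) = sum_i k(x, x_i) c_i for each row of X_new."""
    return gaussian_kernel(X_new, X_train, sigma) @ C

# Toy vector-valued regression problem: y = (sin x, cos x) + noise.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
Y = np.hstack([np.sin(X), np.cos(X)]) + 0.05 * rng.standard_normal((50, 2))
C = fit_rls(X, Y)
Y_hat = predict_rls(X, C, X)
print(np.mean((Y_hat - Y) ** 2))  # small training error
```

The report's risk bounds concern the excess risk of exactly this kind of estimator as a function of the regularization parameter λ and the (generalized) effective dimension of the kernel; the choice of a diagonal operator-valued kernel above is only the simplest special case.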

