Deep Convolutional Networks are Hierarchical Kernel Machines
Author(s)
Anselmi, Fabio; Rosasco, Lorenzo; Tan, Cheston; Poggio, Tomaso
Abstract
We extend i-theory to incorporate not only pooling but also rectifying nonlinearities in an extended HW module (eHW) designed for supervised learning. The two operations roughly correspond to invariance and selectivity, respectively. Under the assumption of normalized inputs, we show that appropriate linear combinations of rectifying nonlinearities are equivalent to radial kernels. If pooling is present, an equivalent kernel also exists. Thus present-day DCNs (Deep Convolutional Networks) can be exactly equivalent to a hierarchy of kernel machines with pooling and non-pooling layers. Finally, we describe a conjecture for theoretically understanding hierarchies of such modules. A main consequence of the conjecture is that hierarchies of eHW modules minimize memory requirements while computing a selective and invariant representation.
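The non-pooling part of the claim can be illustrated numerically. The sketch below is not the memo's construction (which combines rectifiers over thresholds); it checks a related, well-known fact: averaging products of ReLU responses to Gaussian random templates gives a kernel proportional to the order-1 arc-cosine kernel of Cho & Saul, which on unit-norm inputs depends only on the inner product and hence only on the distance between inputs, i.e. is radial on the sphere. All parameter choices (dimension, number of templates) are illustrative assumptions.

```python
# Illustrative sketch, not the memo's exact construction: pooling ReLU units
# over Gaussian random templates yields a kernel that is radial on normalized
# inputs. The closed form below is the expectation E_w[relu(<w,x>) relu(<w,x'>)],
# equal to one half of the order-1 arc-cosine kernel of Cho & Saul (2009).
import numpy as np

rng = np.random.default_rng(0)
d, n_templates = 64, 100_000  # arbitrary illustrative sizes

def pooled_relu_kernel(x, xp, W):
    """Monte-Carlo estimate of E_w[ relu(<w,x>) * relu(<w,xp>) ], w ~ N(0, I)."""
    return np.mean(np.maximum(W @ x, 0.0) * np.maximum(W @ xp, 0.0))

def expected_pooled_relu(x, xp):
    """Closed-form expectation; for unit-norm x, xp it depends only on <x, xp>."""
    cos_t = np.clip(x @ xp / (np.linalg.norm(x) * np.linalg.norm(xp)), -1.0, 1.0)
    theta = np.arccos(cos_t)
    return (np.linalg.norm(x) * np.linalg.norm(xp) / (2 * np.pi)) * (
        np.sin(theta) + (np.pi - theta) * np.cos(theta)
    )

W = rng.standard_normal((n_templates, d))

# Two unit-norm inputs: for normalized x, x' we have ||x - x'||^2 = 2 - 2<x, x'>,
# so a kernel that depends only on <x, x'> is radial on the unit sphere.
x = rng.standard_normal(d); x /= np.linalg.norm(x)
xp = rng.standard_normal(d); xp /= np.linalg.norm(xp)

print("Monte-Carlo pooled-ReLU kernel:", pooled_relu_kernel(x, xp, W))
print("Closed-form expectation       :", expected_pooled_relu(x, xp))
```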
Date issued
2015-08-05
Publisher
Center for Brains, Minds and Machines (CBMM), arXiv
Citation
arXiv:1508.01084
Series/Report no.
CBMM Memo Series;035
Keywords
i-theory, extended HW module (eHW), Invariance, Selectivity, Hierarchy, Machine Learning