Now showing items 1-5 of 5
Theory IIIb: Generalization in Deep Networks
(Center for Brains, Minds and Machines (CBMM), arXiv.org, 2018-06-29)
The general features of the optimization problem for the case of overparametrized nonlinear networks have been clear for a while: SGD selects global minima over local minima with high probability. In the overparametrized ...
Classical generalization bounds are surprisingly tight for Deep Networks
(Center for Brains, Minds and Machines (CBMM), 2018-07-11)
Deep networks are usually trained and tested in a regime in which the training classification error is not a good predictor of the test error. Thus the consensus has been that generalization, defined as convergence of the ...
When Is Handcrafting Not a Curse?
(2018-12-31)
Recently, with the proliferation of deep learning, there is a strong trend of abandoning handcrafted systems/features in machine learning and AI by replacing them with “end-to-end” systems “learned from scratch”. These ...
Representations That Learn vs. Learning Representations
(2018-12-31)
During the last decade, we have witnessed tremendous progress in Machine Learning, and especially in the area of Deep Learning, a.k.a. “Learning Representations” (LearnRep for short). There is even an International Conference ...
Biologically-Plausible Learning Algorithms Can Scale to Large Datasets
(Center for Brains, Minds and Machines (CBMM), 2018-09-27)
The backpropagation (BP) algorithm is often thought to be biologically implausible in the brain. One of the main reasons is that BP requires symmetric weight matrices in the feedforward and feedback pathways. To address ...