On the Forgetting of College Academics: at "Ebbinghaus Speed"?
(Center for Brains, Minds and Machines (CBMM), 2017-06-20)
How important are Undergraduate College Academics after graduation? How much do we actually remember after we leave the college classroom, and for how long? Taking a look at major University ranking methodologies one can ...
Musings on Deep Learning: Properties of SGD
(Center for Brains, Minds and Machines (CBMM), 2017-04-04)
[previously titled "Theory of Deep Learning III: Generalization Properties of SGD"] In Theory III we characterize with a mix of theory and experiments the generalization properties of Stochastic Gradient Descent in ...
Symmetry Regularization
(Center for Brains, Minds and Machines (CBMM), 2017-05-26)
The properties of a representation, such as smoothness, adaptability, generality, equivariance/invariance, depend on restrictions imposed during learning. In this paper, we propose using data symmetries, in the sense of ...
Theory II: Landscape of the Empirical Risk in Deep Learning
(Center for Brains, Minds and Machines (CBMM), arXiv, 2017-03-30)
Previous theoretical work on deep learning and neural network optimization tends to focus on avoiding saddle points and local minima. However, the practical observation is that, at least for the most successful Deep ...
Theory of Deep Learning III: explaining the non-overfitting puzzle
(arXiv, 2017-12-30)
THIS MEMO IS REPLACED BY CBMM MEMO 90
A main puzzle of deep networks revolves around the absence of overfitting despite overparametrization and despite the large capacity demonstrated by zero training error on randomly ...
Exact Equivariance, Disentanglement and Invariance of Transformations
(2017-12-31)
Invariance, equivariance and disentanglement of transformations are important topics in the field of representation learning. Previous models like Variational Autoencoder [1] and Generative Adversarial Networks [2] attempted ...
Human-like Learning: A Research Proposal
(2017-09-28)
We propose Human-like Learning, a new machine learning paradigm aiming at training generalist AI systems in a human-like manner with a focus on human-unique skills.
On the Robustness of Convolutional Neural Networks to Internal Architecture and Weight Perturbations
(Center for Brains, Minds and Machines (CBMM), arXiv, 2017-04-03)
Deep convolutional neural networks are generally regarded as robust function approximators. So far, this intuition is based on perturbations to external stimuli such as the images to be classified. Here we explore the ...
Object-Oriented Deep Learning
(Center for Brains, Minds and Machines (CBMM), 2017-10-31)
We investigate an unconventional direction of research that aims at converting neural networks, a class of distributed, connectionist, sub-symbolic models, into a symbolic level with the ultimate goal of achieving AI ...
Discriminate-and-Rectify Encoders: Learning from Image Transformation Sets
(Center for Brains, Minds and Machines (CBMM), arXiv, 2017-03-13)
The complexity of a learning task is increased by transformations in the input space that preserve class identity. Visual object recognition for example is affected by changes in viewpoint, scale, illumination or planar ...