Building machines that learn and think like people
(Center for Brains, Minds and Machines (CBMM), arXiv, 2016-04-01)
Recent progress in artificial intelligence (AI) has renewed interest in building systems that learn and think like people. Many advances have come from using deep neural networks trained end-to-end in tasks such as object ...
Learning Real and Boolean Functions: When Is Deep Better Than Shallow
(Center for Brains, Minds and Machines (CBMM), arXiv, 2016-03-08)
We describe computational tasks - especially in vision - that correspond to compositional/hierarchical functions. While the universal approximation property holds both for hierarchical and shallow networks, we prove that ...
Do You See What I Mean? Visual Resolution of Linguistic Ambiguities
(Center for Brains, Minds and Machines (CBMM), arXiv, 2016-06-10)
Understanding language goes hand in hand with the ability to integrate complex contextual information obtained via perception. In this work, we present a novel task for grounded language understanding: disambiguating a ...
Group Invariant Deep Representations for Image Instance Retrieval
(Center for Brains, Minds and Machines (CBMM), 2016-01-11)
Most image instance retrieval pipelines are based on comparison of vectors known as global image descriptors between a query image and the database images. Due to their success in large scale image classification, ...
Theory I: Why and When Can Deep Networks Avoid the Curse of Dimensionality?
(Center for Brains, Minds and Machines (CBMM), arXiv, 2016-11-23)
[formerly titled "Why and When Can Deep – but Not Shallow – Networks Avoid the Curse of Dimensionality: a Review"]
The paper reviews and extends an emerging body of theoretical results on deep learning including the ...
Deep vs. shallow networks: An approximation theory perspective
(Center for Brains, Minds and Machines (CBMM), arXiv, 2016-08-12)
The paper briefly reviews several recent results on hierarchical architectures for learning from examples that may formally explain the conditions under which Deep Convolutional Neural Networks perform much better in ...
Universal Dependencies for Learner English
(Center for Brains, Minds and Machines (CBMM), arXiv, 2016-08-01)
We introduce the Treebank of Learner English (TLE), the first publicly available syntactic treebank for English as a Second Language (ESL). The TLE provides manually annotated POS tags and Universal Dependency (UD) trees ...