
dc.contributor.author    Liao, Qianli
dc.contributor.author    Poggio, Tomaso
dc.date.accessioned    2018-12-31T15:05:24Z
dc.date.available    2018-12-31T15:05:24Z
dc.date.issued    2018-12-31
dc.identifier.uri    http://hdl.handle.net/1721.1/119834
dc.description.abstract    During the last decade we have witnessed tremendous progress in Machine Learning, and especially in the area of Deep Learning, a.k.a. “Learning Representations” (LearnRep for short); there is even an International Conference on Learning Representations. Despite the huge success of LearnRep, there is a somewhat overlooked dimension of research that we would like to discuss in this report. We observe a chicken-and-egg problem between “learning” and “representations”. In the view of traditional Machine Learning and Deep Learning, “learning” is the “first-class citizen”: a learning system typically starts from scratch, and the learning process leads to good “representations”. In contrast to this view, we propose the concept of “Representations That Learn” (RepLearn, or Meta Learning): one can start from a “representation” that is either learned, evolved, or even “intelligently designed”. Unlike a system built from scratch, this representation already has some functionalities (e.g., reasoning, memorizing, theory of mind, etc., depending on the task). In addition, such a representation must support a completely new level of learning; hence we have a “representation that learns”. One can go further in this direction and define “Hyper-learning”, in which multiple levels of representations are formed: each level of representation supports a level of learning that leads to the representation of the next level. Note that this is different from stacking multiple layers of deep neural networks. Instead, it is similar to how an operating system is implemented: an OS has at least three levels of representations, namely electrical signals on transistors, machine language, and high-level languages. We believe RepLearn is similar to how humans learn: many representations in our brain are formed before any learning happens (i.e., they are genetically coded). They serve as prior knowledge of the world and support a level of high-level learning (e.g., memorizing events, learning skills, etc.). (A minimal code sketch illustrating this contrast follows the record below.)    en_US
dc.description.sponsorship    This work is supported by the Center for Brains, Minds and Machines (CBMM), funded by NSF STC award CCF-1231216. We thank Brando Miranda for useful comments and discussion after reading this draft in 06/2017.    en_US
dc.language.iso    en_US    en_US
dc.title    Representations That Learn vs. Learning Representations    en_US
dc.type    Technical Report    en_US
dc.type    Working Paper    en_US
dc.type    Other    en_US
dc.contributor.department    Center for Brains, Minds, and Machines
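
As referenced in the abstract above, the following is a minimal, hypothetical Python sketch, not taken from the report: the class names PriorRepresentation and LearnerOnTop and all numerical details are assumptions made here only to contrast a fixed, pre-existing representation with a new level of learning built on top of it.

import numpy as np

class PriorRepresentation:
    """A representation that exists before any learning happens (a stand-in
    for an evolved or 'intelligently designed' prior)."""
    def __init__(self, dim_in, dim_out, seed=0):
        rng = np.random.default_rng(seed)
        # Frozen random projection: never updated by the learner below.
        self.W = rng.standard_normal((dim_in, dim_out)) / np.sqrt(dim_in)

    def encode(self, x):
        # Provides a functionality (a fixed nonlinear feature map) without learning.
        return np.tanh(x @ self.W)

class LearnerOnTop:
    """A new level of learning supported by the prior representation."""
    def __init__(self, rep, dim_out):
        self.rep = rep
        self.w = np.zeros(dim_out)

    def fit(self, X, y, lr=0.1, epochs=200):
        # Only this level is trained; the representation below stays fixed.
        for _ in range(epochs):
            h = self.rep.encode(X)
            grad = h.T @ (h @ self.w - y) / len(y)
            self.w -= lr * grad
        return self

# Usage: the representation comes first; one level of learning happens on top of it.
rep = PriorRepresentation(dim_in=4, dim_out=16)
X = np.random.default_rng(1).standard_normal((64, 4))
y = X[:, 0] - X[:, 2]
model = LearnerOnTop(rep, dim_out=16).fit(X, y)

Under this reading (an assumption of the sketch, not a claim about the report), “learning representations” would instead train the projection W itself from scratch, while “Hyper-learning” would iterate the pattern, with the output of one level's learning serving as the fixed representation for the next level.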

