Streaming Normalization: Towards Simpler and More Biologically-plausible Normalizations for Online and Recurrent Learning
Author(s)
Liao, Qianli; Kawaguchi, Kenji; Poggio, Tomaso
Download: CBMM-Memo-057.pdf (1.268 MB)
Abstract
We systematically explore a spectrum of normalization algorithms related to Batch Normalization (BN) and propose a generalized formulation that simultaneously solves two major limitations of BN: (1) online learning and (2) recurrent learning. Our proposal is simpler and more biologically plausible. Unlike previous approaches, our technique can be applied out of the box to all learning scenarios (e.g., online learning, batch learning, fully-connected, convolutional, feedforward, recurrent, and mixed recurrent-convolutional networks) and compares favorably with existing approaches. We also propose Lp Normalization, which normalizes by statistical moments of different orders. In particular, L1 normalization performs well, is simple to implement, fast to compute, and more biologically plausible, making it ideal for GPU or hardware implementations.
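To make the Lp Normalization idea concrete, the sketch below normalizes a batch of activations by the p-th order moment of the centered activations: p=2 recovers the standard-deviation normalization used by BN, while p=1 divides by the mean absolute deviation, which avoids squares and square roots. This is a minimal illustrative sketch under our own assumptions, not the paper's reference implementation; the function name `lp_normalize` and its interface are hypothetical.

```python
import numpy as np

def lp_normalize(x, p=1, eps=1e-5):
    """Minimal sketch of Lp Normalization (illustrative, not the paper's code).

    Subtract the per-feature mean, then divide by the p-th root of the
    mean p-th power of the absolute deviations. p=2 gives the usual
    standard-deviation normalization of Batch Normalization; p=1 gives
    the mean absolute deviation, which is cheaper to compute.
    """
    mu = x.mean(axis=0)                                    # per-feature mean
    centered = x - mu
    moment = (np.abs(centered) ** p).mean(axis=0) ** (1.0 / p)
    return centered / (moment + eps)

# Example: a batch of 4 samples with 3 features.
x = np.random.randn(4, 3)
y1 = lp_normalize(x, p=1)   # L1 normalization: mean absolute deviation
y2 = lp_normalize(x, p=2)   # L2 normalization: standard deviation (as in BN)
```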
Date issued
2016-10-19
Publisher
Center for Brains, Minds and Machines (CBMM), arXiv
Citation
arXiv:1610.06160v1
Series/Report no.
CBMM Memo Series;057
Keywords
Batch Normalization (BN), recurrent learning, Lp Normalization