DSpace@MIT

  • DSpace@MIT Home
  • Center for Brains, Minds & Machines
  • Publications
  • CBMM Memo Series

SGD Noise and Implicit Low-Rank Bias in Deep Neural Networks

Author(s)
Galanti, Tomer; Poggio, Tomaso
Download: CBMM-Memo-134.pdf (803.6 KB)
Abstract
We analyze deep ReLU neural networks trained with mini-batch stochastic gradient descent (SGD) and weight decay. We prove that the source of the SGD noise is an implicit low-rank constraint across all of the weight matrices in the network. Furthermore, we show, both theoretically and empirically, that when a neural network is trained with SGD and a small batch size, the resulting weight matrices are expected to be of low rank. Our analysis relies on a minimal set of assumptions, and the neural networks may include convolutional layers, residual connections, and batch normalization layers.
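The low-rank effect described in the abstract can be probed empirically. The sketch below (a minimal, hypothetical setup using only NumPy; the architecture, task, and hyperparameters are illustrative choices, not those of the memo) trains a two-layer ReLU network with mini-batch SGD plus weight decay and measures the effective rank of the hidden weight matrix via its singular-value spectrum:

```python
import numpy as np

def effective_rank(W, tol=1e-3):
    """Count singular values above tol * (largest singular value)."""
    s = np.linalg.svd(W, compute_uv=False)
    return int(np.sum(s > tol * s[0]))

def train(batch_size, steps=2000, lr=0.05, weight_decay=5e-3, seed=0):
    """Train a 2-layer ReLU net on a synthetic regression task with
    mini-batch SGD and weight decay; return the hidden weight matrix."""
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(256, 20))
    y = np.tanh(X @ rng.normal(size=(20, 1)))   # low-complexity target
    W1 = rng.normal(scale=0.3, size=(20, 64))
    W2 = rng.normal(scale=0.3, size=(64, 1))
    for _ in range(steps):
        idx = rng.choice(len(X), size=batch_size, replace=False)
        xb, yb = X[idx], y[idx]
        h = np.maximum(xb @ W1, 0.0)            # ReLU hidden layer
        err = (h @ W2) - yb                     # dL/dpred for 0.5 * MSE
        gW2 = h.T @ err / batch_size
        gh = err @ W2.T
        gh[h <= 0] = 0.0                        # ReLU gradient mask
        gW1 = xb.T @ gh / batch_size
        W1 -= lr * (gW1 + weight_decay * W1)    # SGD step + weight decay
        W2 -= lr * (gW2 + weight_decay * W2)
    return W1

# The memo's claim suggests small batches should yield lower effective rank.
r_small = effective_rank(train(batch_size=4))
r_large = effective_rank(train(batch_size=256))
print(r_small, r_large)
```

Comparing `r_small` against `r_large` for the same architecture and step count is one simple way to observe the batch-size dependence the memo analyzes; the exact ranks obtained depend on the seed and hyperparameters.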
Date issued
2022-03-28
URI
https://hdl.handle.net/1721.1/141380
Publisher
Center for Brains, Minds and Machines (CBMM)
Series/Report no.
CBMM Memo;134
Content created by the MIT Libraries, CC BY-NC unless otherwise noted.