Title | Musings on Deep Learning: Properties of SGD
Publication Type | CBMM Memos |
Year of Publication | 2017 |
Authors | Zhang, C, Liao, Q, Rakhlin, A, Sridharan, K, Miranda, B, Golowich, N, Poggio, T |
Date Published | 04/2017 |
Abstract | [formerly titled "Theory of Deep Learning III: Generalization Properties of SGD"] In Theory III we characterize, with a mix of theory and experiments, the generalization properties of Stochastic Gradient Descent (SGD) in overparametrized deep convolutional networks. We show that SGD selects, with high probability, solutions that 1) have zero (or small) empirical error, 2) are degenerate, as shown in Theory II, and 3) have maximum generalization.
DSpace@MIT
Download:
- CBMM Memo 067 v2 (revised 7/19/2017)
- CBMM Memo 067 v3 (revised 9/15/2017)
- CBMM Memo 067 v4 (revised 12/26/2017)
CBMM Memo No: 067
Research Area:
CBMM Relationship:
- CBMM Funded
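
The abstract above concerns plain SGD driving overparametrized networks to zero (or small) empirical error. Below is a minimal, self-contained sketch of that phenomenon, not the memo's experimental setup: the synthetic data, the small two-layer ReLU network, the hinge loss, and all hyperparameters are illustrative assumptions chosen so the example runs with NumPy alone.

```python
# Illustrative sketch (assumptions: synthetic data, two-layer ReLU net, hinge loss;
# not the architecture or datasets used in the memo). Plain mini-batch SGD is
# typically enough to reach (near) zero empirical error when parameters >> samples.
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic binary classification problem: far fewer samples than parameters.
n, d, h = 50, 20, 500                 # 50 samples, 20 features, 500 hidden units
X = rng.standard_normal((n, d))
y = rng.integers(0, 2, n) * 2 - 1     # labels in {-1, +1}

# Overparametrized two-layer ReLU network: ~ d*h + h parameters >> n samples.
W = rng.standard_normal((d, h)) / np.sqrt(d)
v = rng.standard_normal(h) / np.sqrt(h)

def forward(xb):
    z = xb @ W                        # pre-activations
    a = np.maximum(z, 0.0)            # ReLU
    return z, a, a @ v                # scores

lr, batch, epochs = 0.05, 10, 500
for epoch in range(epochs):
    idx = rng.permutation(n)
    for start in range(0, n, batch):
        b = idx[start:start + batch]
        z, a, s = forward(X[b])
        # Hinge loss max(0, 1 - y*s); subgradient with respect to the scores
        margin = y[b] * s
        g_s = np.where(margin < 1, -y[b], 0.0) / len(b)
        # Backpropagate through the two layers
        g_v = a.T @ g_s
        g_a = np.outer(g_s, v)
        g_z = g_a * (z > 0)
        g_W = X[b].T @ g_z
        # Plain SGD update: no momentum, no explicit regularization
        v -= lr * g_v
        W -= lr * g_W

_, _, scores = forward(X)
train_err = np.mean(np.sign(scores) != y)
print(f"empirical (training) error after SGD: {train_err:.3f}")  # typically 0.000
```

With these (assumed) settings the training error usually reaches zero even though the labels are random, which is the overparametrized, zero-empirical-error regime the abstract refers to; the memo's claims about degeneracy and generalization are, of course, established there, not by this toy run.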