Title | For HyperBFs AGOP is a greedy approximation to gradient descent |
Publication Type | CBMM Memos |
Year of Publication | 2024 |
Authors | Gan, Y., Poggio, T. |
Number | 148 |
Date Published | 07/2024 |
Abstract | The Average Gradient Outer Product (AGOP) provides a novel approach to feature learning in neural networks. We applied both AGOP and gradient descent to learn the matrix M in the Hyper Basis Function Network (HyperBF) and observed very similar performance. We show formally that AGOP is a greedy approximation to gradient descent. |
Download | CBMM-Memo-148.pdf |
CBMM Relationship | CBMM Funded |
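
The abstract refers to the Average Gradient Outer Product. As a minimal sketch of the standard AGOP definition — the average over samples of the outer product of the function's gradient with itself — the snippet below estimates it with finite differences. The function name `agop`, the finite-difference scheme, and the example target function are illustrative assumptions, not taken from the memo.

```python
import numpy as np

def agop(f, X, eps=1e-5):
    # Average Gradient Outer Product of a scalar function f over the
    # rows of X: M = (1/n) * sum_i grad f(x_i) grad f(x_i)^T.
    # Gradients are estimated with central finite differences (an
    # illustrative choice; in a network one would use autodiff).
    n, d = X.shape
    M = np.zeros((d, d))
    for x in X:
        g = np.zeros(d)
        for j in range(d):
            e = np.zeros(d)
            e[j] = eps
            g[j] = (f(x + e) - f(x - e)) / (2 * eps)
        M += np.outer(g, g)
    return M / n

# Example: f(x) = (w.x)^2 depends on a single direction w, so its AGOP
# is (up to scale) the rank-one matrix w w^T -- the kind of low-rank
# feature matrix M that AGOP-style feature learning recovers.
rng = np.random.default_rng(0)
w = np.array([1.0, 0.0, 0.0])
f = lambda x: float(np.dot(w, x)) ** 2
X = rng.standard_normal((200, 3))
M = agop(f, X)
```

Here `M` concentrates its mass in the `w`-direction entry `M[0, 0]`, while entries for the irrelevant coordinates stay near zero.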