For HyperBFs AGOP is a greedy approximation to gradient descent

Publication Type: CBMM Memos
Year of Publication: 2024
Authors: Gan, Y., Poggio, T.
Number: 148
Date Published: 07/2024
Abstract

The Average Gradient Outer Product (AGOP) provides a novel approach to feature learning in neural networks. We applied both AGOP and gradient descent to learn the matrix M in a Hyper Basis Function (HyperBF) network and observed very similar performance. We show formally that AGOP is a greedy approximation of gradient descent.
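
For concreteness, here is a minimal sketch (not the authors' code) of the two objects the abstract compares: a Gaussian HyperBF predictor f(x) = sum_a c_a exp(-(x - t_a)^T M (x - t_a)) with a learned metric M, and its AGOP, the average over training inputs of the outer product of the predictor's input gradient with itself. The Gaussian kernel, the trace normalization, and all identifiers below are illustrative assumptions.

```python
# Minimal sketch, not the authors' code: a Gaussian HyperBF predictor with a
# learned metric M, and its Average Gradient Outer Product (AGOP).
# All identifiers (hyperbf_grad, agop, the trace normalization) are
# illustrative assumptions, not APIs from the memo.
import numpy as np

def hyperbf_grad(x, centers, coeffs, M):
    """Gradient wrt x of f(x) = sum_a c_a * exp(-(x - t_a)^T M (x - t_a))."""
    diffs = x - centers                                # (k, d)
    quad = np.einsum('kd,de,ke->k', diffs, M, diffs)   # (x - t_a)^T M (x - t_a)
    w = coeffs * np.exp(-quad)                         # (k,)
    # d/dx exp(-q_a) = -exp(-q_a) * 2 M (x - t_a) for symmetric M
    return -2.0 * M @ (w[:, None] * diffs).sum(axis=0)

def agop(X, centers, coeffs, M):
    """AGOP: (1/n) * sum_i grad f(x_i) grad f(x_i)^T."""
    G = np.stack([hyperbf_grad(x, centers, coeffs, M) for x in X])  # (n, d)
    return G.T @ G / len(X)

# Toy run: iterate M <- AGOP(M), one greedy way to learn the metric.
rng = np.random.default_rng(0)
d, k, n = 5, 10, 200
X = rng.normal(size=(n, d))
centers = rng.normal(size=(k, d))
coeffs = rng.normal(size=k)
M = np.eye(d)
for _ in range(3):
    M = agop(X, centers, coeffs, M)
    M /= np.trace(M)        # fix the scale (an illustrative choice)
```

Iterating M <- AGOP(M) as above is one natural greedy scheme for learning the metric; this is roughly the setting in which the memo shows that the AGOP update greedily approximates gradient descent on M.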

DSpace@MIT: https://hdl.handle.net/1721.1/155675

Download: CBMM-Memo-148.pdf
CBMM Memo No: 148

CBMM Relationship: 

  • CBMM Funded