pangstrom posted:
> This is a factoid from a previous life, but you used to hear that backprop was biologically implausible except for maybe kinda-sorta in the cerebellum.

You may be interested in this paper! https://arxiv.org/abs/2202.08587

quote:
> Using backpropagation to compute gradients of objective functions for optimization has remained a mainstay of machine learning. Backpropagation, or reverse-mode differentiation, is a special case within the general family of automatic differentiation algorithms that also includes the forward mode. We present a method to compute gradients based solely on the directional derivative that one can compute exactly and efficiently via the forward mode. We call this formulation the forward gradient, an unbiased estimate of the gradient that can be evaluated in a single forward run of the function, entirely eliminating the need for backpropagation in gradient descent. We demonstrate forward gradient descent in a range of problems, showing substantial savings in computation and enabling training up to twice as fast in some cases.

tl;dr: you can compute a directional derivative exactly in a single forward pass and turn it into an unbiased estimate of the gradient. I'm not sure that makes backprop biologically plausible per se, but it at least gets rid of the need for bidirectional synapses.
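To make the trick concrete, here's a minimal sketch of forward gradient descent in plain Python (my own toy construction, not the paper's code): a tiny dual-number class gives the exact directional derivative ∇f(x)·v in one forward pass, and scaling the random tangent v by that scalar yields an unbiased gradient estimate, since E[(∇f·v)v] = ∇f when v ~ N(0, I).

```python
import random

class Dual:
    """Minimal forward-mode AD: carries a value and a tangent."""
    def __init__(self, val, tan=0.0):
        self.val, self.tan = val, tan
    def _wrap(self, o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = self._wrap(o)
        return Dual(self.val + o.val, self.tan + o.tan)
    __radd__ = __add__
    def __sub__(self, o):
        o = self._wrap(o)
        return Dual(self.val - o.val, self.tan - o.tan)
    def __mul__(self, o):
        o = self._wrap(o)
        return Dual(self.val * o.val,
                    self.tan * o.val + self.val * o.tan)  # product rule
    __rmul__ = __mul__

def f(xs):
    """Toy objective: sum((x_i - 1)^2), minimized at all-ones."""
    return sum((x - 1.0) * (x - 1.0) for x in xs)

def forward_gradient(f, x):
    """One forward pass: sample v ~ N(0, I), read off ∇f(x)·v, return (∇f·v)v."""
    v = [random.gauss(0.0, 1.0) for _ in x]
    out = f([Dual(xi, vi) for xi, vi in zip(x, v)])
    return [out.tan * vi for vi in v]  # unbiased estimate of ∇f(x)

random.seed(0)
x = [5.0, -3.0]
for _ in range(2000):
    g = forward_gradient(f, x)
    x = [xi - 0.01 * gi for xi, gi in zip(x, g)]
print(x)  # should be close to [1.0, 1.0]
```

Note there's no backward pass anywhere: each step is one evaluation of f on dual numbers, which is why the paper can frame this as backprop-free gradient descent. The estimate is noisier than the true gradient (variance grows with dimension), which is the trade-off against reverse mode.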
|
# Jun 17, 2023 00:21