fritz
Jul 26, 2003

Boris Galerkin posted:

I'm just not really sure how this is any different from Newton's method or basically any numerical method that minimizes an objective/loss function.

Newton's method and other second-order techniques aren't really appropriate for NNs for a couple of reasons, not least of which is that the Hessian is enormous: even if you never store it explicitly, you still have to compute with it.
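Just to put a rough number on "enormous": the Hessian of a model with n parameters has n² entries, so even a modest network with 10 million weights gives you on the order of 10^14 entries, something like 400 TB in float32. That's why the second-order-ish methods people do use for NNs (L-BFGS, Hessian-free, etc.) only ever work with approximations or Hessian-vector products, never the full matrix.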

Backpropagation is just an efficient way of computing the gradient of the objective function that exploits the particular layered structure of NNs: it's the chain rule applied layer by layer, reusing the intermediate values from the forward pass.
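If it helps, here's roughly what that looks like written out by hand, a minimal numpy sketch of backprop for a toy two-layer network (sigmoid hidden layer, linear output, squared-error loss). Names and shapes are just illustrative, not from any particular library:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, b1, W2, b2):
    # Forward pass: cache the intermediates the backward pass will reuse.
    z1 = W1 @ x + b1
    a1 = sigmoid(z1)
    y_hat = W2 @ a1 + b2
    return z1, a1, y_hat

def backward(x, y, W1, b1, W2, b2):
    z1, a1, y_hat = forward(x, W1, b1, W2, b2)
    loss = 0.5 * np.sum((y_hat - y) ** 2)

    # Backward pass: chain rule, applied layer by layer from the output back.
    d_yhat = y_hat - y                  # dL/dy_hat
    dW2 = np.outer(d_yhat, a1)          # dL/dW2
    db2 = d_yhat                        # dL/db2
    d_a1 = W2.T @ d_yhat                # dL/da1
    d_z1 = d_a1 * a1 * (1.0 - a1)       # dL/dz1 (sigmoid derivative)
    dW1 = np.outer(d_z1, x)             # dL/dW1
    db1 = d_z1                          # dL/db1
    return loss, (dW1, db1, dW2, db2)

# One plain gradient-descent step on a single example.
rng = np.random.default_rng(0)
x, y = rng.standard_normal(3), rng.standard_normal(2)
W1, b1 = rng.standard_normal((4, 3)), np.zeros(4)
W2, b2 = rng.standard_normal((2, 4)), np.zeros(2)

loss, grads = backward(x, y, W1, b1, W2, b2)
lr = 0.1
W1, b1, W2, b2 = (p - lr * g for p, g in zip((W1, b1, W2, b2), grads))

The whole backward pass costs about as much as the forward pass, and all you ever need is the gradient, no n-by-n matrix anywhere, which is the practical difference from Newton-style updates.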
