Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2018/12/17 03:29:02 UTC

[GitHub] ascust opened a new issue #13660: A question about the mechanism of autograd
URL: https://github.com/apache/incubator-mxnet/issues/13660
 
 
   I am wondering how autograd works when I call backward twice.
   For example, I have a network and a loss and I do:
   ```
   with autograd.record(True):
       network_out = ....
       loss = criterion(network_out, gt)
       autograd.backward(loss)
   with autograd.record(True):
       network_out2 = ....
       loss = criterion(network_out2, gt2)
       autograd.backward(loss)
   
   optimizer.step()
   ```
   Since I call backward twice, does the second call override the gradients from the first, so that the network is effectively updated based on the second backward only? If that is the case, how do I accumulate the gradients from both backward calls and then call "step()" once to update the network?
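
   For context: Gluon parameters default to grad_req='write', so a second backward() overwrites the gradients from the first. The usual way to accumulate is to set grad_req='add' and zero the gradients manually after the update. Below is a minimal sketch of that pattern; the `net`, `criterion`, `trainer`, and random batches are hypothetical stand-ins for the code in the question, not the poster's actual setup.
   ```python
   import mxnet as mx
   from mxnet import autograd, gluon

   # Hypothetical stand-ins for the network, loss, and optimizer in the question.
   net = gluon.nn.Dense(1)
   net.initialize()
   criterion = gluon.loss.L2Loss()
   trainer = gluon.Trainer(net.collect_params(), 'sgd', {'learning_rate': 0.01})

   # By default grad_req is 'write', so a second backward() overwrites the
   # gradients from the first; 'add' makes backward() accumulate instead.
   net.collect_params().setattr('grad_req', 'add')

   # Placeholder batches (stand-ins for the real inputs and ground truth).
   x1 = mx.nd.random.uniform(shape=(4, 8))
   gt1 = mx.nd.random.uniform(shape=(4, 1))
   x2 = mx.nd.random.uniform(shape=(4, 8))
   gt2 = mx.nd.random.uniform(shape=(4, 1))

   with autograd.record():
       loss1 = criterion(net(x1), gt1)
   loss1.backward()            # gradients for batch 1

   with autograd.record():
       loss2 = criterion(net(x2), gt2)
   loss2.backward()            # added on top of the gradients from batch 1

   trainer.step(batch_size=8)  # one update with the accumulated gradients

   # With grad_req='add', step() does not clear the gradients; zero them
   # manually before the next accumulation cycle.
   for param in net.collect_params().values():
       param.zero_grad()
   ```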

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services