Posted to commits@mxnet.apache.org by gi...@git.apache.org on 2017/08/01 11:56:28 UTC

[GitHub] cainiao2hao opened a new issue #7289: BlockGrad and weight decay on backpropagation

URL: https://github.com/apache/incubator-mxnet/issues/7289
 
 
   I have a symbol for training that looks like this:
   ```python
   mx.sym.Group([loss0, loss1, mx.symbol.BlockGrad(output0)])
   ```
   `output0` is used to compute an accuracy metric during training. However, it seems that the `BlockGrad` branch still affects the network weights because of weight decay. Is there any way to exclude the `output0` branch from backpropagation while still using `output0` to evaluate the training results?
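   For context, here is a minimal sketch of the kind of setup described above. Every name in it (`data`, `shared`, `head0`, and so on) is a hypothetical placeholder, not taken from the original code:
   ```python
   import mxnet as mx

   # Hypothetical two-loss network; all layer and variable names are placeholders.
   data   = mx.sym.Variable('data')
   label0 = mx.sym.Variable('label0')
   label1 = mx.sym.Variable('label1')

   shared = mx.sym.FullyConnected(data, num_hidden=128, name='shared')
   logits = mx.sym.FullyConnected(shared, num_hidden=10, name='head0')

   loss0 = mx.sym.SoftmaxOutput(logits, label0, name='loss0')
   loss1 = mx.sym.SoftmaxOutput(
       mx.sym.FullyConnected(shared, num_hidden=10, name='head1'),
       label1, name='loss1')

   # output0 is a prediction used only to compute a training metric; BlockGrad
   # stops gradients from flowing back through this extra output.
   output0 = mx.sym.softmax(logits, name='output0')

   net = mx.sym.Group([loss0, loss1, mx.sym.BlockGrad(output0)])
   ```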
   @piiswrong @mli 
   
 
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services