Posted to issues@mxnet.apache.org by GitBox <gi...@apache.org> on 2020/12/11 03:40:06 UTC

[GitHub] [incubator-mxnet] Sundrops opened a new issue #19657: gradients using loss=F.make_loss(myloss) are not the same as gradients not using loss=F.make_loss(myloss) in Hybrid programming

Sundrops opened a new issue #19657:
URL: https://github.com/apache/incubator-mxnet/issues/19657


   > This operator accepts a customized loss function symbol as a terminal loss and the symbol should be an operator with no backward dependency. The output of this function is the gradient of loss with respect to the input data.
   
   The description of ndarray.make_loss is the same as the description of symbol.make_loss, and it only explains the symbol case, not the ndarray case.
   I want to know what `F.make_loss()` does when I use `net.hybridize()` and `loss.backward()`.
   
   https://mxnet.apache.org/versions/1.7.0/api/python/docs/api/ndarray/ndarray.html?highlight=make_loss#mxnet.ndarray.make_loss
   https://mxnet.apache.org/versions/1.7.0/api/python/docs/api/symbol/symbol.html#mxnet.symbol.make_loss
   
   ```python
   import mxnet as mx
   from mxnet.gluon.model_zoo import vision

   class Myloss(mx.gluon.nn.HybridBlock):
       def __init__(self):
           super(Myloss, self).__init__()

       def hybrid_forward(self, F, pred, gt):
           # per-sample squared L2 distance, halved
           loss_l2 = F.sum(F.square(pred - gt), axis=1) / 2
           return F.make_loss(loss_l2)

   net = vision.resnet18_v1()   # stand-in for the undefined resnet() in the report
   net.initialize()
   net.hybridize()
   myloss = Myloss()

   data = mx.nd.random.uniform(shape=(2, 3, 224, 224))
   y = mx.nd.random.uniform(shape=(2, 1000))
   with mx.autograd.record():   # backward() needs a recorded graph
       x = net(data)
       loss = myloss(x, y)
   loss.backward()
   ```
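   
   For comparison, the variant the title refers to, without `F.make_loss`, would look like this (a minimal sketch continuing from the snippet above; whether its gradients match the make_loss version is exactly what this issue asks):
   
   ```python
   # Hypothetical counterpart that returns the raw loss value, with no make_loss.
   class MylossPlain(mx.gluon.nn.HybridBlock):
       def hybrid_forward(self, F, pred, gt):
           return F.sum(F.square(pred - gt), axis=1) / 2

   # Reuses net, data, and y from the snippet above.
   myloss_plain = MylossPlain()
   with mx.autograd.record():
       x = net(data)
       loss = myloss_plain(x, y)
   loss.backward()   # the report says these gradients differ from the make_loss run
   ```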
   
   



[GitHub] [incubator-mxnet] szha commented on issue #19657: gradients using loss=F.make_loss(myloss) are not the same as gradients not using loss=F.make_loss(myloss) in Hybrid programming

Posted by GitBox <gi...@apache.org>.
szha commented on issue #19657:
URL: https://github.com/apache/incubator-mxnet/issues/19657#issuecomment-775401348


   @Sundrops there's no need to use make_loss in Gluon as all values can be used as head gradients now.
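   
   A minimal sketch of that point (hypothetical shapes, not from the thread): in Gluon, `backward()` on a non-scalar loss supplies an implicit head gradient of ones, which is what `make_loss` provided in the symbolic API, so the wrapper is redundant.
   
   ```python
   import mxnet as mx

   pred = mx.nd.random.uniform(shape=(4, 10))
   gt = mx.nd.random.uniform(shape=(4, 10))
   pred.attach_grad()

   with mx.autograd.record():
       loss_l2 = mx.nd.sum(mx.nd.square(pred - gt), axis=1) / 2

   # No make_loss needed: backward() on a non-scalar value uses an implicit
   # head gradient of ones; an explicit one can be passed via out_grad.
   loss_l2.backward()
   print(pred.grad)   # d(loss)/d(pred) = pred - gt
   ```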



[GitHub] [incubator-mxnet] github-actions[bot] commented on issue #19657: gradients using loss=F.make_loss(myloss) are not the same as gradients not using loss=F.make_loss(myloss) in Hybrid programming

Posted by GitBox <gi...@apache.org>.
github-actions[bot] commented on issue #19657:
URL: https://github.com/apache/incubator-mxnet/issues/19657#issuecomment-742947261


   Welcome to Apache MXNet (incubating)! We are on a mission to democratize AI, and we are glad that you are contributing to it by opening this issue.
   Please make sure to include all the relevant context, and one of the @apache/mxnet-committers will be here shortly.
   If you are interested in contributing to our project, let us know! Also, be sure to check out our guide on [contributing to MXNet](https://mxnet.apache.org/community/contribute) and our [development guides wiki](https://cwiki.apache.org/confluence/display/MXNET/Developments).



[GitHub] [incubator-mxnet] szha closed issue #19657: gradients using loss=F.make_loss(myloss) are not the same as gradients not using loss=F.make_loss(myloss) in Hybrid programming

Posted by GitBox <gi...@apache.org>.
szha closed issue #19657:
URL: https://github.com/apache/incubator-mxnet/issues/19657


   

