Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2018/01/05 05:14:24 UTC
[GitHub] feevos opened a new issue #9316: bug in parameter default values of gluon.loss.Loss
URL: https://github.com/apache/incubator-mxnet/issues/9316
## Description
(I think!) There is a bug in the default values of the parameters ```weight``` and ```batch_axis``` of the module ```gluon.loss.Loss```. In particular, when I try to create a custom loss function derived from ```Loss```, I get the following error:
**implementation that produces error**
```Python
from mxnet.gluon.loss import Loss

class jaccard(Loss):
    """
    Jaccard loss coefficient. Adapted from tensorlayer:
    https://github.com/zsdonghao/tensorlayer/blob/master/tensorlayer/cost.py
    INPUT:
        tensor of size (Nbatch, Nclasses, W, H)
    OUTPUT:
        The average (over batch) of the average value (over classes). Can I do better?
    TODO: Use a weight for each class? See example in tests/Notebooks/custom_loss.ipynb
    """
    def __init__(self, _smooth=1.0e-3, _axis=[2, 3], _weight=None, _batch_axis=0, **kwards):
        Loss.__init__(self, **kwards)
        with self.name_scope():
            self.smooth = _smooth
            self.axis = _axis

    def hybrid_forward(self, F, _label, _output):
        itrs = F.sum(_label * _output, axis=self.axis)
        l = F.sum(_label * _label, axis=self.axis)
        r = F.sum(_output * _output, axis=self.axis)
        IoU = (2.0 * itrs + self.smooth) / (l + r + self.smooth)
        meanIoU = F.mean(IoU, axis=1)
        meanTot = F.mean(meanIoU)
        return meanTot

jacc_loss = jaccard()
```
## Error Message:
```Python
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-5-8b8b7a97ea46> in <module>()
----> 1 jacc_loss = jaccard()

<ipython-input-4-fed5e9fc1f68> in __init__(self, _smooth, _axis, _weight, _batch_axis, **kwards)
     11
     12     def __init__(self, _smooth=1.0e-3, _axis=[2,3], _weight=None, _batch_axis=0, **kwards):
---> 13         Loss.__init__(self, **kwards)
     14
     15         with self.name_scope():

TypeError: __init__() takes exactly 3 arguments (1 given)
```
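The error is consistent with ```Loss.__init__``` taking ```weight``` and ```batch_axis``` as required positional parameters (no defaults), so forwarding only ```**kwards``` leaves them unfilled. A minimal sketch that reproduces the failure mode, using a hypothetical ```Base``` class as a stand-in for ```gluon.loss.Loss```:

```python
# Hypothetical stand-in for gluon.loss.Loss as implemented at the time
# of this report: weight and batch_axis are required positional
# parameters with no default values.
class Base(object):
    def __init__(self, weight, batch_axis, **kwargs):
        self.weight = weight
        self.batch_axis = batch_axis


class Child(Base):
    def __init__(self, **kwards):
        # Forwarding only **kwards leaves weight/batch_axis unfilled,
        # producing the same TypeError as in the traceback above.
        Base.__init__(self, **kwards)


try:
    Child()
except TypeError as exc:
    print("TypeError:", exc)
```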
## What have you tried to solve it?
If I modify the call to the ```Loss``` constructor by explicitly passing the default values (as described in the [documentation](https://mxnet.incubator.apache.org/api/python/gluon/loss.html)):
```Python
class jaccard(Loss):
    """
    Jaccard loss coefficient. Adapted from tensorlayer:
    https://github.com/zsdonghao/tensorlayer/blob/master/tensorlayer/cost.py
    INPUT:
        tensor of size (Nbatch, Nclasses, W, H)
    OUTPUT:
        The average (over batch) of the average value (over classes). Can I do better?
    TODO: Use a weight for each class? See example in tests/Notebooks/custom_loss.ipynb
    """
    def __init__(self, _smooth=1.0e-3, _axis=[2, 3], _weight=None, _batch_axis=0, **kwards):
        Loss.__init__(self, weight=_weight, batch_axis=_batch_axis, **kwards)
        with self.name_scope():
            self.smooth = _smooth
            self.axis = _axis

    def hybrid_forward(self, F, _label, _output):
        itrs = F.sum(_label * _output, axis=self.axis)
        l = F.sum(_label * _label, axis=self.axis)
        r = F.sum(_output * _output, axis=self.axis)
        IoU = (2.0 * itrs + self.smooth) / (l + r + self.smooth)
        meanIoU = F.mean(IoU, axis=1)
        meanTot = F.mean(meanIoU)
        return meanTot
```
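For reference, the same computation can be checked outside Gluon with a small NumPy sketch (the function name ```jaccard_loss_np``` is just for illustration); identical label and output tensors should give a score of 1.0:

```python
import numpy as np

def jaccard_loss_np(label, output, smooth=1.0e-3, axis=(2, 3)):
    """NumPy version of the hybrid_forward above: smoothed soft-IoU score,
    averaged over classes (axis 1), then over the batch."""
    itrs = np.sum(label * output, axis=axis)
    l = np.sum(label * label, axis=axis)
    r = np.sum(output * output, axis=axis)
    iou = (2.0 * itrs + smooth) / (l + r + smooth)
    return np.mean(np.mean(iou, axis=1))

x = np.ones((2, 3, 4, 4))     # (Nbatch, Nclasses, W, H)
print(jaccard_loss_np(x, x))  # perfect overlap -> 1.0
```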
the problem is solved. I guess there is an inconsistency between the documentation and the implementation (the constructor has no default values, contrary to what the docs describe)? Hope this helps.