Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2017/12/01 00:56:01 UTC

[GitHub] bradcar commented on issue #8559: How to use Prelu in gluon?

bradcar commented on issue #8559: How to use Prelu in gluon?
URL: https://github.com/apache/incubator-mxnet/issues/8559#issuecomment-348369626
 
 
   Yes, leaky ReLU isn't the same as parameterized (parametric) leaky ReLU.
   
   I would love to see a PReLU in Gluon.
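   In the meantime, one possible workaround (just a rough sketch, not an official Gluon API) is to wrap the existing LeakyReLU operator, using its act_type='prelu' mode described below, in a custom HybridBlock with a learnable slope. The block and parameter names here (PReLU, alpha) are only illustrative:
   
        import mxnet as mx
        from mxnet.gluon import HybridBlock
        
        class PReLU(HybridBlock):
            """Parametric ReLU: max(0, x) + alpha * min(0, x), with alpha learnt."""
            def __init__(self, alpha_initializer=mx.init.Constant(0.25), **kwargs):
                super(PReLU, self).__init__(**kwargs)
                with self.name_scope():
                    # a single learnable slope; shape could also be (channels,)
                    # for a per-channel slope
                    self.alpha = self.params.get('alpha', shape=(1,),
                                                 init=alpha_initializer)
        
            def hybrid_forward(self, F, x, alpha):
                # LeakyReLU with act_type='prelu' treats 'gamma' as the learnt slope
                return F.LeakyReLU(x, gamma=alpha, act_type='prelu')
   
   Such a block could then be dropped into a network like any other layer, e.g. net.add(PReLU()) after a Dense or Conv2D layer, and the slope would be updated by the trainer along with the other parameters.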
   
   All of this is made a bit more confusing since in MXNet, the mxnet.symbol.LeakyReLU
   https://mxnet.incubator.apache.org/api/python/symbol.html#mxnet.symbol.LeakyReLU
   takes a parameter act_type that can implement several different types of activation functions ({'elu', 'leaky', 'prelu', 'rrelu'}, optional, default='leaky'):
   
       elu: Exponential Linear Unit. y = x > 0 ? x : slope * (exp(x)-1)
       leaky: Leaky ReLU. y = x > 0 ? x : slope * x
        prelu: Parametric ReLU. This is the same as leaky, except that the slope is learnt during training.
        rrelu: Randomized ReLU. Same as leaky, but the slope is uniformly and randomly chosen from [lower_bound, upper_bound) during training, while fixed at (lower_bound+upper_bound)/2 for inference.
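   For the symbol API itself, the act_type argument is passed directly to mxnet.symbol.LeakyReLU. A short sketch (the variable names are only illustrative; for 'prelu' the gamma input is the slope that gets learnt):
   
        import mxnet as mx
        
        data = mx.sym.Variable('data')
        
        # leaky: fixed slope for the negative part
        leaky = mx.sym.LeakyReLU(data=data, act_type='leaky', slope=0.1)
        
        # prelu: the slope ('gamma') is a learnable parameter, updated during training
        gamma = mx.sym.Variable('prelu_gamma')
        prelu = mx.sym.LeakyReLU(data=data, gamma=gamma, act_type='prelu')
        
        # elu: exponential curve for the negative part
        elu = mx.sym.LeakyReLU(data=data, act_type='elu', slope=1.0)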
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services