Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2017/12/14 16:17:08 UTC

[GitHub] bradcar commented on issue #8912: add Gluon PReLU activation layer

URL: https://github.com/apache/incubator-mxnet/pull/8912#issuecomment-351758419
 
 
   Open question: since the Gluon LeakyReLU block is implemented as
   <pre> return F.LeakyReLU(x, act_type='leaky', slope=self._alpha, name='fwd')</pre>
   and F.LeakyReLU also supports act_type='prelu', why not implement the Gluon PReLU block the same way:
   <pre> return F.LeakyReLU(x, act_type='prelu', ...etc</pre>
   Which would be more efficient?
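   For context, the difference between the two act_types is only in where the
   negative slope comes from: LeakyReLU uses a fixed scalar slope, while PReLU
   treats the slope as a learnable parameter (typically one per channel). A
   minimal NumPy sketch of the two semantics (not the MXNet implementation,
   just an illustration of the math being discussed):
   <pre>
   import numpy as np

   def leaky_relu(x, slope=0.25):
       # LeakyReLU: fixed negative slope shared by every element
       return np.where(x > 0, x, slope * x)

   def prelu(x, alpha):
       # PReLU: alpha is a learnable parameter updated during training,
       # typically one value per channel rather than a fixed constant
       return np.where(x > 0, x, alpha * x)

   x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
   print(leaky_relu(x, slope=0.1))   # [-0.2 -0.1  0.   1.   2. ]
   print(prelu(x, alpha=0.25))      # [-0.5  -0.25  0.    1.    2.  ]
   </pre>
   So routing Gluon PReLU through F.LeakyReLU(act_type='prelu') would mainly
   mean passing the layer's learned alpha parameter in place of the fixed slope.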
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services