Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2020/04/22 23:23:51 UTC

[GitHub] [incubator-mxnet] leandrolcampos opened a new issue #18140: Reparameterization trick for Gamma distribution

leandrolcampos opened a new issue #18140:
URL: https://github.com/apache/incubator-mxnet/issues/18140


   ## Description
   I'd like to suggest implementing implicit reparameterization gradients, as described in [1], for the Gamma distribution samplers ndarray.sample_gamma and symbol.sample_gamma.
   
   This would allow the Gamma distribution, and others that depend on it, such as the Beta, Dirichlet, and Student's t distributions, to be used as easily as the Normal distribution in stochastic computation graphs.
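   For reference, the implicit trick from [1] differentiates the CDF instead of inverting it: if z ~ Gamma(α) with F(z; α) = u, then dz/dα = −(∂F/∂α) / f(z; α). A minimal NumPy/SciPy sketch (the shape value and finite-difference step size are arbitrary illustrations, not part of any proposed API):

   ```python
   import numpy as np
   from scipy.special import gammainc   # regularized lower incomplete gamma = Gamma(alpha, 1) CDF
   from scipy.stats import gamma

   alpha, u, h = 2.5, 0.7, 1e-5
   z = gamma.ppf(u, alpha)              # a Gamma(alpha) sample via the inverse CDF

   # Implicit reparameterization: dz/dalpha = -(dF/dalpha) / f(z; alpha),
   # with dF/dalpha estimated by a central finite difference.
   dF_dalpha = (gammainc(alpha + h, z) - gammainc(alpha - h, z)) / (2 * h)
   implicit_grad = -dF_dalpha / gamma.pdf(z, alpha)

   # Sanity check: differentiating the inverse CDF directly (explicit
   # reparameterization) must give the same number.
   explicit_grad = (gamma.ppf(u, alpha + h) - gamma.ppf(u, alpha - h)) / (2 * h)
   ```

   The advantage of the implicit form is that it never needs the inverse CDF, only the forward CDF and density, which is what makes it tractable for Gamma.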
   
   Stochastic computation graphs are necessary for variational autoencoders (VAEs), automatic variational inference, Bayesian learning in neural networks, and principled regularization in deep networks.
   
   The approach proposed in [1] is the same one used by TensorFlow's tf.random.gamma method, as we can see in [2].
   
   Thanks for the opportunity to request this feature.
   
   ## References
   - [1] [Michael Figurnov, Shakir Mohamed, Andriy Mnih. Implicit Reparameterization Gradients, 2018](https://arxiv.org/pdf/1805.08498.pdf)
   - [2] [tf.random.gamma](https://www.tensorflow.org/api_docs/python/tf/random/gamma)
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [incubator-mxnet] xidulu commented on issue #18140: Reparameterization trick for Gamma distribution

xidulu commented on issue #18140:
URL: https://github.com/apache/incubator-mxnet/issues/18140#issuecomment-618131555


   Currently, pathwise gradients are only implemented for `mx.np.random.{normal, gumbel, logistic, weibull, exponential, pareto}` in the backend.
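   (For context, the pathwise trick those samplers rely on rewrites the sample as a deterministic transform of parameter-free noise; MXNet does this in its C++ backend, but a plain-NumPy illustration for the Normal case looks like this:)

   ```python
   import numpy as np

   rng = np.random.default_rng(0)
   mu, sigma = 1.0, 2.0

   eps = rng.standard_normal(5)   # parameter-free base noise
   z = mu + sigma * eps           # reparameterized Normal(mu, sigma) sample

   # Because z is a deterministic function of (mu, sigma, eps),
   # gradients flow through the transform itself:
   dz_dmu = np.ones_like(eps)     # dz/dmu = 1
   dz_dsigma = eps                # dz/dsigma = eps
   ```

   Gamma has no such simple parameter-free base transform, which is exactly why the implicit approach is needed there.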
   
   We are planning to implement implicit reparameterization gradients for Gamma-related distributions in the C++ backend in the future; as you pointed out, this is extremely useful in scenarios like `BBVI for LDA`.
   
   Another possible solution is to wrap the sampling op in a CustomOp, which lets you define the backward computation manually in Python.
   https://mxnet.apache.org/api/python/docs/tutorials/extend/customop.html
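   A hedged sketch of the backward pass such a CustomOp would need to supply (the function name is hypothetical, and SciPy with a finite difference stands in for whatever the op would actually call; it computes the implicit gradient dz/dα = −(∂F/∂α) / f(z; α)):

   ```python
   import numpy as np
   from scipy.special import gammainc   # Gamma(alpha, 1) CDF
   from scipy.stats import gamma

   def gamma_sample_backward(z, alpha, grad_output, h=1e-5):
       """Hypothetical backward pass for a Gamma-sampling CustomOp:
       propagates grad_output through the sample z with respect to the
       shape alpha, using the implicit reparameterization gradient."""
       dF_dalpha = (gammainc(alpha + h, z) - gammainc(alpha - h, z)) / (2 * h)
       dz_dalpha = -dF_dalpha / gamma.pdf(z, alpha)
       return grad_output * dz_dalpha

   g = gamma_sample_backward(np.array([1.0, 3.0]), 2.0, np.ones(2))
   ```

   The forward pass would simply call the existing sampler; only the backward needs to be written by hand, which is what CustomOp exposes.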





[GitHub] [incubator-mxnet] leandrolcampos commented on issue #18140: Reparameterization trick for Gamma distribution

leandrolcampos commented on issue #18140:
URL: https://github.com/apache/incubator-mxnet/issues/18140#issuecomment-618403965


   Hi @xidulu,
   
   Thanks for your suggestion; I'll follow it. But, for performance reasons, I'm also looking forward to your C++ backend implementation of implicit reparameterization gradients for Gamma-related distributions.





[GitHub] [incubator-mxnet] sxjscience commented on issue #18140: Reparameterization trick for Gamma distribution

sxjscience commented on issue #18140:
URL: https://github.com/apache/incubator-mxnet/issues/18140#issuecomment-618110265


   @xidulu @szhengac 





[GitHub] [incubator-mxnet] xidulu edited a comment on issue #18140: Reparameterization trick for Gamma distribution

xidulu edited a comment on issue #18140:
URL: https://github.com/apache/incubator-mxnet/issues/18140#issuecomment-618131555

(The edit adds a greeting, "Hi @leandrolcampos"; the rest of the comment is unchanged from the version above.)