Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2020/04/23 01:58:41 UTC

[GitHub] [incubator-mxnet] xidulu commented on issue #18140: Reparameterization trick for Gamma distribution

xidulu commented on issue #18140:
URL: https://github.com/apache/incubator-mxnet/issues/18140#issuecomment-618131555


   Currently, the pathwise gradient is only implemented for `mx.np.random.{normal, gumbel, logistic, weibull, exponential, pareto}` in the backend.
   
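   For example, the pathwise gradient through `mx.np.random.normal` already works with autograd today. A minimal sketch, assuming a recent MXNet build with the numpy interface enabled via `npx.set_np()`; the toy loss is purely illustrative:

   ```python
   import mxnet as mx
   from mxnet import autograd, npx

   npx.set_np()

   loc = mx.np.array(2.0)
   scale = mx.np.array(0.5)
   loc.attach_grad()
   scale.attach_grad()

   with autograd.record():
       # The sampler itself is differentiable w.r.t. loc and scale,
       # so gradients flow back through the random draw.
       samples = mx.np.random.normal(loc, scale, size=(1000,))
       loss = (samples ** 2).mean()
   loss.backward()

   print(loc.grad, scale.grad)
   ```
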
   We are planning to implement the implicit reparameterization gradient for Gamma-related distributions in the C++ backend in the future; as you pointed out, it is extremely useful in scenarios like `BBVI for LDA`.
   
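   For context, the implicit reparameterization gradient (Figurnov et al., 2018) differentiates the sampler through its CDF: for `z ~ Gamma(alpha, 1)`, `dz/dalpha = -(dF(z; alpha)/dalpha) / f(z; alpha)`, where `F` is the CDF and `f` the density. A hedged NumPy/SciPy sketch of that formula, not backend code; the central finite difference for `dF/dalpha` is purely illustrative:

   ```python
   import numpy as np
   from scipy.stats import gamma

   def implicit_gamma_grad(z, alpha, eps=1e-5):
       """dz/dalpha for z ~ Gamma(alpha, 1) via the implicit formula."""
       # d/dalpha of the Gamma CDF, by central finite difference.
       dF_dalpha = (gamma.cdf(z, alpha + eps) - gamma.cdf(z, alpha - eps)) / (2.0 * eps)
       # dF/dz is just the density f(z; alpha).
       return -dF_dalpha / gamma.pdf(z, alpha)

   alpha = 2.0
   z = np.random.gamma(alpha, size=5)
   print(implicit_gamma_grad(z, alpha))
   ```
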
   Another possible solution is to wrap the sampling op in a CustomOp, which allows you to define the backward computation manually in Python:
   https://mxnet.apache.org/api/python/docs/tutorials/extend/customop.html
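
   A hedged sketch of what that could look like for a Gamma sampler (the op name, shapes, and helper code here are illustrative, not part of MXNet); the backward pass applies the implicit gradient from the sketch above:

   ```python
   import mxnet as mx
   import numpy as np
   from scipy.stats import gamma

   class GammaSampler(mx.operator.CustomOp):
       def forward(self, is_train, req, in_data, out_data, aux):
           alpha = in_data[0].asnumpy()
           # Draw z ~ Gamma(alpha, 1) on the CPU with NumPy.
           self.assign(out_data[0], req[0], mx.nd.array(np.random.gamma(alpha)))

       def backward(self, req, out_grad, in_data, out_data, in_grad, aux):
           alpha = in_data[0].asnumpy()
           z = out_data[0].asnumpy()
           eps = 1e-5
           # Implicit gradient: dz/dalpha = -(dF/dalpha) / f(z; alpha).
           dF_dalpha = (gamma.cdf(z, alpha + eps) - gamma.cdf(z, alpha - eps)) / (2.0 * eps)
           dz_dalpha = -dF_dalpha / gamma.pdf(z, alpha)
           self.assign(in_grad[0], req[0],
                       mx.nd.array(out_grad[0].asnumpy() * dz_dalpha))

   @mx.operator.register("gamma_sampler")
   class GammaSamplerProp(mx.operator.CustomOpProp):
       def __init__(self):
           super(GammaSamplerProp, self).__init__(need_top_grad=True)

       def list_arguments(self):
           return ['alpha']

       def list_outputs(self):
           return ['output']

       def infer_shape(self, in_shape):
           # Output samples have the same shape as the alpha input.
           return in_shape, [in_shape[0]], []

       def create_operator(self, ctx, shapes, dtypes):
           return GammaSampler()

   alpha = mx.nd.array([2.0, 3.0])
   alpha.attach_grad()
   with mx.autograd.record():
       z = mx.nd.Custom(alpha, op_type='gamma_sampler')
   z.backward()
   print(alpha.grad)
   ```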

