Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2018/12/10 08:14:09 UTC

[GitHub] zhreshold commented on issue #13594: Autograd error when using custom parameters

URL: https://github.com/apache/incubator-mxnet/issues/13594#issuecomment-445727359
 
 
@seujung Reading and writing parameters with `.data()` and `set_data()` inside the computation graph, i.e. in `forward` or `hybrid_forward`, is not allowed because it prevents autograd from recording the gradients.
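
For contrast, a minimal sketch of the read pattern that breaks recording (the `self.scale` / `self.loc` attribute names here are illustrative, not from the issue):
```
def forward(self, x):
    # Per the explanation above, pulling raw arrays out of Parameters
    # inside forward is not recorded by autograd, so the gradients for
    # scale and loc are lost.
    scale = self.scale.data()
    loc = self.loc.data()
    return scale * (x + loc)
```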
   
Instead, use the parameters directly in `forward`; Gluon will automatically prepare them for you.
   
A correct `forward` function is:
```
def forward(self, x, scale, loc, logdet):
    _, _, height, width = x.shape
    # initialization is not allowed in forward
    # if not self.initialized:
    #     self.init(x)
    #     self.initialized = True

    log_abs = logabs(scale)

    logdet = height * width * nd.sum(log_abs)

    if self.use_logdet:
        return scale * (x + loc), logdet
    else:
        return scale * (x + loc),
```
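
For completeness, a minimal sketch of how such parameters can be declared so that Gluon hands them to the forward pass automatically. The `ActNorm` name, shapes, and initializers below are illustrative assumptions, not from the issue:
```
from mxnet import nd
from mxnet.gluon import nn

class ActNorm(nn.HybridBlock):
    """Illustrative block whose scale/loc are Gluon Parameters."""
    def __init__(self, channels, **kwargs):
        super(ActNorm, self).__init__(**kwargs)
        with self.name_scope():
            # Parameters declared via self.params.get(...) are passed to
            # hybrid_forward by Gluon itself; no .data() calls are needed.
            self.scale = self.params.get('scale', shape=(1, channels, 1, 1), init='ones')
            self.loc = self.params.get('loc', shape=(1, channels, 1, 1), init='zeros')

    def hybrid_forward(self, F, x, scale, loc):
        # scale and loc arrive already prepared by Gluon, so autograd
        # records every operation performed on them.
        return scale * (x + loc)

# Usage: initialize once, then call the block as a function.
block = ActNorm(channels=3)
block.initialize()
y = block(nd.random.normal(shape=(2, 3, 8, 8)))
```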
