Posted to dev@singa.apache.org by "Xue Wanqi (JIRA)" <ji...@apache.org> on 2018/03/07 07:17:00 UTC
[jira] [Commented] (SINGA-342) Support autograd
[ https://issues.apache.org/jira/browse/SINGA-342?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16389165#comment-16389165 ]
Xue Wanqi commented on SINGA-342:
---------------------------------
I am working on this!
> Support autograd
> -----------------
>
> Key: SINGA-342
> URL: https://issues.apache.org/jira/browse/SINGA-342
> Project: Singa
> Issue Type: New Feature
> Reporter: wangwei
> Priority: Major
>
> Autograd computes the partial derivatives of a composite function by following the chain rule (i.e., back-propagation).
> To implement autograd, we can follow [https://stackoverflow.com/questions/32034237/how-does-numpys-transpose-method-permute-the-axes-of-an-array] and [https://github.com/HIPS/autograd].
> In particular, we record the operation and operands of each result tensor during forward propagation, and construct a graph from the recorded information. Once loss.backward() is called, we run back-propagation over the graph to compute the gradients of the parameters.
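The record-then-replay scheme described above can be sketched in a few dozen lines of Python. Note this is a hypothetical minimal illustration, not SINGA's actual API: the Tensor class, its fields, and backward() below are invented for the sketch, and it handles only scalars with add/mul.

```python
# Minimal scalar autograd sketch (hypothetical; not SINGA's actual API).
# Each result Tensor records the operation that produced it (as one
# local-gradient function per operand) and the operand tensors;
# backward() replays the recorded graph in reverse to apply the chain rule.

class Tensor:
    def __init__(self, value, parents=(), grad_fns=()):
        self.value = value        # forward value
        self.grad = 0.0           # accumulated gradient
        self.parents = parents    # operand tensors recorded at forward time
        self.grad_fns = grad_fns  # one local-gradient function per parent

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Tensor(self.value + other.value,
                      parents=(self, other),
                      grad_fns=(lambda g: g, lambda g: g))

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Tensor(self.value * other.value,
                      parents=(self, other),
                      grad_fns=(lambda g: g * other.value,
                                lambda g: g * self.value))

    def backward(self):
        # Topologically order the recorded graph, then propagate
        # gradients from this (loss) tensor back to the leaves.
        order, seen = [], set()

        def visit(t):
            if id(t) not in seen:
                seen.add(id(t))
                for p in t.parents:
                    visit(p)
                order.append(t)

        visit(self)
        self.grad = 1.0  # d(loss)/d(loss)
        for t in reversed(order):
            for parent, fn in zip(t.parents, t.grad_fns):
                parent.grad += fn(t.grad)


# Usage: loss = x * y + x, so dloss/dx = y + 1 and dloss/dy = x.
x, y = Tensor(2.0), Tensor(3.0)
loss = x * y + x
loss.backward()
print(x.grad, y.grad)  # 4.0 2.0
```

A real implementation would generalize the per-operand gradient functions to tensor operations (matmul, conv, etc.) and let gradients flow only into tensors flagged as parameters, but the graph recording and reverse traversal are the same idea.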
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)