Posted to commits@mxnet.apache.org by jx...@apache.org on 2017/12/13 03:11:57 UTC
[incubator-mxnet] branch master updated: update autograd.Function
docstring for usage (#9039)
This is an automated email from the ASF dual-hosted git repository.
jxie pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git
The following commit(s) were added to refs/heads/master by this push:
new 8f79651 update autograd.Function docstring for usage (#9039)
8f79651 is described below
commit 8f796516e2df9c33bb715e3c1f7cf289074fb1ab
Author: Sheng Zha <sz...@users.noreply.github.com>
AuthorDate: Tue Dec 12 19:11:53 2017 -0800
update autograd.Function docstring for usage (#9039)
* update autograd.Function docstring for usage
* Update autograd.py
---
python/mxnet/autograd.py | 14 +++++++++++++-
1 file changed, 13 insertions(+), 1 deletion(-)
diff --git a/python/mxnet/autograd.py b/python/mxnet/autograd.py
index 340a9e6..cc9cad8 100644
--- a/python/mxnet/autograd.py
+++ b/python/mxnet/autograd.py
@@ -372,7 +372,7 @@ class Function(object):
For example, a stable sigmoid function can be defined as::
- class sigmoid(Function):
+ class sigmoid(mx.autograd.Function):
def forward(self, x):
y = 1 / (1 + mx.nd.exp(-x))
self.save_for_backward(y)
@@ -383,6 +383,18 @@ class Function(object):
# and returns as many NDArrays as forward's arguments.
y, = self.saved_tensors
return y * (1-y)
+
+ Then, the function can be used in the following way::
+
+ func = sigmoid()
+ x = mx.nd.random.uniform(shape=(10,))
+ x.attach_grad()
+
+ with mx.autograd.record():
+ m = func(x)
+ m.backward()
+ dx = x.grad.asnumpy()
+
"""
_bwd_functype = CFUNCTYPE(c_int, c_int, c_int, POINTER(c_void_p),
POINTER(c_int), c_int, c_void_p)
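The docstring added above relies on the identity that a sigmoid's gradient can be computed from its own output: if y = sigmoid(x), then dy/dx = y * (1 - y), which is why forward saves y via self.save_for_backward. As a sanity check of that formula outside MXNet, here is a small NumPy analogue (NumPy stands in for mx.nd purely for illustration; the function names are hypothetical, not part of the patch):

```python
import numpy as np

# Mirror of the docstring's custom Function, in plain NumPy:
# forward computes the sigmoid and returns y, which backward
# reuses because d(sigmoid)/dx = y * (1 - y).
def sigmoid_forward(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_backward(y, dy):
    # dy plays the role of the incoming head gradient.
    return dy * y * (1.0 - y)

x = np.linspace(-4.0, 4.0, 9)
y = sigmoid_forward(x)
dx = sigmoid_backward(y, np.ones_like(x))

# Verify against a central finite-difference approximation.
eps = 1e-6
numeric = (sigmoid_forward(x + eps) - sigmoid_forward(x - eps)) / (2 * eps)
assert np.allclose(dx, numeric, atol=1e-6)
```

In the real API, the same check can be done by comparing x.grad after m.backward() with the finite-difference estimate.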