Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2020/07/13 22:13:16 UTC

[GitHub] [incubator-mxnet] leezu commented on a change in pull request #18691: Merge numpy.mxnet.io into mxnet official website

leezu commented on a change in pull request #18691:
URL: https://github.com/apache/incubator-mxnet/pull/18691#discussion_r453969147



##########
File path: docs/python_docs/python/tutorials/getting-started/crash-course/2-nn.md
##########
@@ -15,47 +15,50 @@
 <!--- specific language governing permissions and limitations -->
 <!--- under the License. -->
 
-# Create a neural network
+# Step 2: Create a neural network
 
-Now let's look how to create neural networks in Gluon. In addition the NDArray package (`nd`) that we just covered, we now will also import the neural network `nn` package from `gluon`.
+In this step, you learn how to use NP on MXNet to create neural networks in Gluon. In addition to the `np` package that you learned about in the previous step [Step 1: Manipulate data with NP on MXNet](1-ndarray.md), you also import the neural network `nn` package from `gluon`.
+
+Use the following commands to import the packages required for this step.
 
 ```{.python .input  n=2}
-from mxnet import nd
+from mxnet import np, npx
 from mxnet.gluon import nn
+npx.set_np()  # Change MXNet to the numpy-like mode.
 ```
 
 ## Create your neural network's first layer
 
-Let's start with a dense layer with 2 output units.
+Use the following code example to start with a dense layer with two output units.
 <!-- mention what the none and the linear parts mean? -->
 
 ```{.python .input  n=31}
 layer = nn.Dense(2)
 layer
 ```
 
-Then initialize its weights with the default initialization method, which draws random values uniformly from $[-0.7, 0.7]$.
+Initialize its weights with the default initialization method, which draws random values uniformly from $[-0.07, 0.07]$. You can see this in the following example.
 
 ```{.python .input  n=32}
 layer.initialize()
 ```
 
-Then we do a forward pass with random data. We create a $(3,4)$ shape random input `x` and feed into the layer to compute the output.
+Do a forward pass with random data, as shown in the following example. Create a $(3,4)$ shape random input `x` and feed it into the layer to compute the output.
 
 ```{.python .input  n=34}
-x = nd.random.uniform(-1,1,(3,4))
+x = np.random.uniform(-1,1,(3,4))
 layer(x)
 ```
 
-As can be seen, the layer's input limit of 2 produced a $(3,2)$ shape output from our $(3,4)$ input. Note that we didn't specify the input size of `layer` before (though we can specify it with the argument `in_units=4` here), the system will automatically infer it during the first time we feed in data, create and initialize the weights. So we can access the weight after the first forward pass:
+As can be seen, the layer produced a $(3,2)$ shape output from our $(3,4)$ input because it has two output units. You didn't specify the input size of `layer` before, though you can specify it with the argument `in_units=4` here. The system automatically infers it the first time you feed in data and then creates and initializes the weights. You can access the weight after the first forward pass, as shown in this example.
 
 ```{.python .input  n=35}
-layer.weight.data()
+# layer.weight.data() # FIXME

Review comment:
       FIXME?
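
For reference, a minimal self-contained sketch of what this cell could look like once the FIXME is resolved, assuming `layer.weight.data()` behaves the same under `npx.set_np()` and returns the initialized weights as an `np.ndarray`:

```{.python .input}
from mxnet import np, npx
from mxnet.gluon import nn
npx.set_np()

layer = nn.Dense(2)
layer.initialize()
x = np.random.uniform(-1, 1, (3, 4))
layer(x)                   # first forward pass triggers shape inference
layer.weight.data()        # the initialized (2, 4) weight array is now accessible
```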

##########
File path: docs/python_docs/python/tutorials/getting-started/crash-course/1-ndarray.md
##########
@@ -15,113 +15,108 @@
 <!--- specific language governing permissions and limitations -->
 <!--- under the License. -->
 
-# Manipulate data with `ndarray`
+# Step 1: Manipulate data with NP on MXNet
 
-We'll start by introducing the `NDArray`, MXNet’s primary tool for storing and transforming data. If you’ve worked with `NumPy` before, you’ll notice that an NDArray is, by design, similar to NumPy’s multi-dimensional array.
+This getting started exercise introduces the `np` package, which is the primary tool for storing and
+transforming data on MXNet. If you’ve worked with NumPy before, you’ll notice `np` is, by design, similar to NumPy.
 
-## Get started
+## Import packages and create an array
 
-To get started, let's import the `ndarray` package (`nd` is a shorter alias) from MXNet.
 
-```{.python .input  n=1}
-# If you haven't installed MXNet yet, you can uncomment the following line to
-# install the latest stable release
-# !pip install -U mxnet
+To get started, run the following commands to import the `np` package together with the NumPy extensions package `npx`. Together, `np` and `npx` make up the NP on MXNet front end.
 
-from mxnet import nd
+```{.python .input  n=1}
+from mxnet import np, npx
+npx.set_np()  # Activate NumPy-like mode.

Review comment:
       Linking https://github.com/apache/incubator-mxnet/pull/18631, as this line will need to be removed after that PR is merged.
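
For reference, if https://github.com/apache/incubator-mxnet/pull/18631 does make the NumPy-like mode the default, the import cell could presumably drop the opt-in call; a sketch under that assumption:

```{.python .input}
from mxnet import np, npx  # npx is still used for the extension functions later in the tutorial
# npx.set_np() would no longer be required once NumPy-like mode is the default.
```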

##########
File path: docs/python_docs/python/tutorials/getting-started/crash-course/1-ndarray.md
##########
@@ -15,113 +15,108 @@
 <!--- specific language governing permissions and limitations -->
 <!--- under the License. -->
 
-# Manipulate data with `ndarray`
+# Step 1: Manipulate data with NP on MXNet
 
-We'll start by introducing the `NDArray`, MXNet’s primary tool for storing and transforming data. If you’ve worked with `NumPy` before, you’ll notice that an NDArray is, by design, similar to NumPy’s multi-dimensional array.
+This getting started exercise introduces the `np` package, which is the primary tool for storing and

Review comment:
       Let's summarize the extent to which `np` is similar and link to a document containing the details: `docs/python_docs/python/tutorials/getting-started/deepnumpy/deepnumpy-vs-numpy.md`
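
For reference, one way such a summary could be illustrated (an assumption about how the paragraph might be expanded, not text from the PR): most array creation, reshaping, and reduction idioms carry over from classic NumPy to `mxnet.np` with the same signatures, while details such as the default dtype and device placement differ and belong in the linked comparison document.

```{.python .input}
import numpy as onp          # classic NumPy, for comparison
from mxnet import np, npx
npx.set_np()

a = np.ones((2, 3))                  # same call signature as numpy.ones
b = np.arange(6).reshape(2, 3)       # familiar arange / reshape idioms
print((a + b).sum(axis=1))           # NumPy-style arithmetic and reductions
print((onp.ones((2, 3)) + onp.arange(6).reshape(2, 3)).sum(axis=1))  # classic-NumPy equivalent
```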




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org