Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2018/10/31 18:49:32 UTC

[GitHub] gigasquid closed pull request #12974: Improve clojure tutorial

gigasquid closed pull request #12974: Improve clojure tutorial
URL: https://github.com/apache/incubator-mxnet/pull/12974

This is a PR merged from a forked repository. Because GitHub hides the
original diff on merge, it is reproduced below for the sake of provenance:

diff --git a/contrib/clojure-package/examples/tutorial/project.clj b/contrib/clojure-package/examples/tutorial/project.clj
index 4910886ca54..7f3254e641e 100644
--- a/contrib/clojure-package/examples/tutorial/project.clj
+++ b/contrib/clojure-package/examples/tutorial/project.clj
@@ -19,4 +19,7 @@
   :description "MXNET tutorials"
   :plugins [[lein-cljfmt "0.5.7"]]
   :dependencies [[org.clojure/clojure "1.9.0"]
-                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]])
+                 ;; Uncomment the one appropriate for your machine & configuration:
+                 #_[org.apache.mxnet.contrib.clojure/clojure-mxnet-linux-cpu "1.3.0"]
+                 #_[org.apache.mxnet.contrib.clojure/clojure-mxnet-linux-gpu "1.3.0"]
+                 #_[org.apache.mxnet.contrib.clojure/clojure-mxnet-osx-cpu "1.3.0"]])
diff --git a/contrib/clojure-package/examples/tutorial/src/tutorial/kvstore.clj b/contrib/clojure-package/examples/tutorial/src/tutorial/kvstore.clj
index 558b21f0aa4..f35d4a06922 100644
--- a/contrib/clojure-package/examples/tutorial/src/tutorial/kvstore.clj
+++ b/contrib/clojure-package/examples/tutorial/src/tutorial/kvstore.clj
@@ -16,35 +16,44 @@
 ;;
 
 (ns tutorial.kvstore
+  "A REPL tutorial of the MXNet Clojure API for KVStore, based on
+  https://mxnet.incubator.apache.org/api/clojure/kvstore.html"
   (:require [org.apache.clojure-mxnet.kvstore :as kvstore]
             [org.apache.clojure-mxnet.ndarray :as ndarray]
             [org.apache.clojure-mxnet.context :as context]))
 
-;;Basic Push and Pull
-;;Provides basic operation over multiple devices (GPUs or CPUs) on a single device.
 
-;; Initialization
-;; Let’s consider a simple example. It initializes a (int, NDArray) pair into the store, and then pulls the value out.
+;;;; Basic Push and Pull
 
-(def kv (kvstore/create "local")) ;; create a local kvstore
+;; Provides basic operations over multiple devices (GPUs or CPUs) on
+;; a single machine.
+
+;;; Initialization
+;; Let’s consider a simple example. It initializes a (`int`,
+;; `NDArray`) pair into the store, and then pulls the value out.
+
+(def kv (kvstore/create "local")) ; create a local kvstore
 (def shape [2 3])
-;;; init the kvstore with a vector of keys (strings) and ndarrays
+;; init the kvstore with a vector of keys (strings) and ndarrays
 (kvstore/init kv ["3"] [(ndarray/* (ndarray/ones shape) 2)])
 (def a (ndarray/zeros shape))
 (kvstore/pull kv ["3"] [a])
 (ndarray/->vec a) ;=> [2.0 2.0 2.0 2.0 2.0 2.0]
 
 
-;;Push, Aggregation, and Updater
-;;For any key that’s been initialized, you can push a new value with the same shape to the key, as follows:
-
+;;; Push, Aggregation, and Updater
+;; For any key that’s been initialized, you can push a new value with
+;; the same shape to the key, as follows:
 (kvstore/push kv ["3"] [(ndarray/* (ndarray/ones shape) 8)])
 (kvstore/pull kv ["3"] [a])
 (ndarray/->vec a);=>[8.0 8.0 8.0 8.0 8.0 8.0]
 
-;;The data that you want to push can be stored on any device. Furthermore, you can push multiple values into the same key, where KVStore first sums all of these values, and then pushes the aggregated value, as follows:
+;; The data that you want to push can be stored on any
+;; device. Furthermore, you can push multiple values into the same
+;; key, where KVStore first sums all of these values, and then pushes
+;; the aggregated value, as follows:
 
-;; using multiple cpus instead of gpus
+;; (Here we use multiple CPUs.)
 (def cpus [(context/cpu 0) (context/cpu 1) (context/cpu 2)])
 (def b [(ndarray/ones shape {:ctx (nth cpus 0)})
         (ndarray/ones shape {:ctx (nth cpus 1)})
@@ -53,22 +62,33 @@
 (kvstore/pull kv "3" a)
 (ndarray/->vec a) ;=> [3.0 3.0 3.0 3.0 3.0 3.0]
 
-
-;;Pull
-;;You’ve already seen how to pull a single key-value pair. Similar to the way that you use the push command, you can pull the value into several devices with a single call.
+;;; Pull
+;; You’ve already seen how to pull a single key-value pair. Similar to
+;; the way that you use the push command, you can pull the value into
+;; several devices with a single call.
 (def b [(ndarray/ones shape {:ctx (context/cpu 0)})
         (ndarray/ones shape {:ctx (context/cpu 1)})])
 (kvstore/pull kv ["3" "3"] b)
 (map ndarray/->vec b) ;=> ([3.0 3.0 3.0 3.0 3.0 3.0] [3.0 3.0 3.0 3.0 3.0 3.0])
 
-;;List Key-Value Pairs
-;;All of the operations that we’ve discussed so far are performed on a single key. KVStore also provides the interface for generating a list of key-value pairs. For a single device, use the following:
+
+;;;; List Key-Value Pairs
+
+;; All of the operations that we’ve discussed so far are performed on
+;; a single key. KVStore also provides the interface for generating a
+;; list of key-value pairs. For a single device, use the following:
 
 (def ks ["5" "7" "9"])
-(kvstore/init kv ks [(ndarray/ones shape) (ndarray/ones shape) (ndarray/ones shape)])
-(kvstore/push kv ks [(ndarray/ones shape) (ndarray/ones shape) (ndarray/ones shape)])
-(def b [(ndarray/zeros shape) (ndarray/zeros shape) (ndarray/zeros shape)])
+(kvstore/init kv ks [(ndarray/ones shape)
+                     (ndarray/ones shape)
+                     (ndarray/ones shape)])
+(kvstore/push kv ks [(ndarray/ones shape)
+                     (ndarray/ones shape)
+                     (ndarray/ones shape)])
+(def b [(ndarray/zeros shape)
+        (ndarray/zeros shape)
+        (ndarray/zeros shape)])
 (kvstore/pull kv ks b)
-(map ndarray/->vec b);=> ([1.0 1.0 1.0 1.0 1.0 1.0] [1.0 1.0 1.0 1.0 1.0 1.0] [1.0 1.0 1.0 1.0 1.0 1.0])
+(map ndarray/->vec b) ;=> ([1.0 1.0 1.0 1.0 1.0 1.0] [1.0 1.0 1.0 1.0 1.0 1.0] [1.0 1.0 1.0 1.0 1.0 1.0])
 
 
diff --git a/contrib/clojure-package/examples/tutorial/src/tutorial/module.clj b/contrib/clojure-package/examples/tutorial/src/tutorial/module.clj
index 3cef342f0ed..4ca50ff5cd4 100644
--- a/contrib/clojure-package/examples/tutorial/src/tutorial/module.clj
+++ b/contrib/clojure-package/examples/tutorial/src/tutorial/module.clj
@@ -16,6 +16,8 @@
 ;;
 
 (ns tutorial.module
+  "A REPL tutorial of the MXNet Clojure API for Module, based on
+  https://mxnet.incubator.apache.org/api/clojure/module.html"
   (:require [clojure.java.io :as io]
             [clojure.java.shell :refer [sh]]
             [org.apache.clojure-mxnet.eval-metric :as eval-metric]
@@ -24,12 +26,26 @@
             [org.apache.clojure-mxnet.symbol :as sym]
             [org.apache.clojure-mxnet.ndarray :as ndarray]))
 
+
+;; The Module API provides an intermediate and high-level interface
+;; for performing computation with neural networks in MXNet. Module
+;; wraps a Symbol and one or more Executors. It has both a high level
+;; and intermediate level API.
+
+
+;;;; Prepare the Data
+
+;; In this example, we are going to use the MNIST data set. To start,
+;; we can run some helper scripts to download the data for us.
+
 (def data-dir "data/")
 
 (when-not (.exists (io/file (str data-dir "train-images-idx3-ubyte")))
   (sh "../../scripts/get_mnist_data.sh"))
 
-;;; Load the MNIST datasets
+;; MXNet provides functions in the `io` namespace to load the MNIST
+;; datasets into training and test data iterators that we can use with
+;; our module.
 (def train-data (mx-io/mnist-iter {:image (str data-dir "train-images-idx3-ubyte")
                                    :label (str data-dir "train-labels-idx1-ubyte")
                                    :label-name "softmax_label"
@@ -47,11 +63,13 @@
                                   :flat true
                                   :silent false}))
 
-;; The module API provides an intermediate and high-level interface for performing computation with neural networks in MXNet.  Module wraps a Symbol and one or more Executors. It has both a high level and intermediate level api
 
-;; Preparing a module for Computation
+;;;; Preparing a module for Computation
 
-;; construct a module
+;; To construct a module, we need a symbol as input. This symbol takes
+;; input data in the first layer, followed by fully connected and relu
+;; activation layers, and ends in a softmax layer for output.
 
 (let [data (sym/variable "data")
       fc1 (sym/fully-connected "fc1" {:data data :num-hidden 128})
@@ -62,7 +80,7 @@
       out (sym/softmax-output "softmax" {:data fc3})]
   out) ;=>#object[org.apache.mxnet.Symbol 0x1f43a406 "org.apache.mxnet.Symbol@1f43a406"]
 
-;; You can also use as-> for easier threading
+;; You can also write this with the `as->` threading macro.
 
 
 (def out (as-> (sym/variable "data") data
@@ -75,10 +93,15 @@
 ;=> #'tutorial.module/out
 
 
-;; By default, context is the CPU. If you need data parallelization, you can specify a GPU context or an array of GPU contexts.
-;; like this (m/module out {:contexts [(context/gpu)]})
+;; By default, context is the CPU. If you need data parallelization,
+;; you can specify a GPU context or an array of GPU contexts, like
+;; this: `(m/module out {:contexts [(context/gpu)]})`
 
-;; Before you can compute with a module, you need to call `bind` to allocate the device memory and `initParams` or `set-params` to initialize the parameters. If you simply want to fit a module, you don’t need to call `bind` and `init-params` explicitly, because the `fit` function automatically calls them if they are needed.
+;; Before you can compute with a module, you need to call `bind` to
+;; allocate the device memory and `init-params` or `set-params` to
+;; initialize the parameters. If you simply want to fit a module, you
+;; don’t need to call `bind` and `init-params` explicitly, because the
+;; `fit` function automatically calls them if they are needed.
 
 (let [mod (m/module out)]
   (-> mod
@@ -86,29 +109,46 @@
                :label-shapes (mx-io/provide-label train-data)})
       (m/init-params)))
 
-;; Now you can compute with the module using functions like `forward`, `backward`, etc.
+;; Now you can compute with the module using functions like `forward`,
+;; `backward`, etc.
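+
+;; For example, a minimal sketch of a manual forward/backward pass
+;; (assuming `forward` takes a data batch and `backward` needs no
+;; arguments for a module with a loss output):
+;; (let [mod (-> (m/module out)
+;;               (m/bind {:data-shapes (mx-io/provide-data train-data)
+;;                        :label-shapes (mx-io/provide-label train-data)})
+;;               (m/init-params))]
+;;   (mx-io/do-batches train-data
+;;                     (fn [batch]
+;;                       (m/forward mod batch)
+;;                       (m/backward mod))))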
 
 
-;; Training, Predicting, and Evaluating
+;;;; Training and Predicting
 
-;;Modules provide high-level APIs for training, predicting, and evaluating. To fit a module, call the `fit` function with some DataIters:
+;; Modules provide high-level APIs for training, predicting, and
+;; evaluating. To fit a module, call the `fit` function with some data
+;; iterators:
 
-(def mod (m/fit (m/module out) {:train-data train-data :eval-data test-data :num-epoch 1}))
+(def mod
+  (m/fit (m/module out) {:train-data train-data
+                         :eval-data test-data
+                         :num-epoch 1}))
+;; =>
 ;; Epoch  0  Train- [accuracy 0.12521666]
 ;; Epoch  0  Time cost- 8392
 ;; Epoch  0  Validation-  [accuracy 0.2227]
 
 
-;; You can pass in batch-end callbacks using batch-end-callback and epoch-end callbacks using epoch-end-callback in the `fit-params`. You can also set parameters using functions like in the fit-params like optimizer and eval-metric. To learn more about the fit-params, see the fit-param function options. To predict with a module, call `predict` with a DataIter:
+;; You can pass in batch-end callbacks using `batch-end-callback` and
+;; epoch-end callbacks using `epoch-end-callback` in the
+;; `fit-params`. You can also set options such as the optimizer and
+;; the eval metric in the `fit-params`. To learn more, see the options
+;; of the `fit-params` function.
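+
+;; For example, a minimal sketch (this assumes the optimizer namespace
+;; is required as `[org.apache.clojure-mxnet.optimizer :as optimizer]`,
+;; and the exact option keys are assumed from the names above):
+;; (m/fit (m/module out)
+;;        {:train-data train-data
+;;         :eval-data test-data
+;;         :num-epoch 1
+;;         :fit-params (m/fit-params
+;;                      {:optimizer (optimizer/sgd {:learning-rate 0.1})
+;;                       :eval-metric (eval-metric/accuracy)})})
+
+;; To predict with a module, call `predict` with a DataIter: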
+
+(def results
+  (m/predict mod {:eval-data test-data}))
 
-(def results (m/predict mod {:eval-data test-data}))
 (first results) ;=>#object[org.apache.mxnet.NDArray 0x3540b6d3 "org.apache.mxnet.NDArray@a48686ec"]
 
 (first (ndarray/->vec (first results))) ;=>0.08261358
 
-;;The module collects and returns all of the prediction results. For more details about the format of the return values, see the documentation for the `predict` function.
+;; The module collects and returns all of the prediction results. For
+;; more details about the format of the return values, see the
+;; documentation for the `predict` function.
 
-;;When prediction results might be too large to fit in memory, use the `predict-every-batch` API
+;; If the prediction results might be too large to fit in memory, use
+;; the `predict-every-batch` API.
 
 (let [preds (m/predict-every-batch mod {:eval-data test-data})]
   (mx-io/reduce-batches test-data
@@ -118,23 +158,33 @@
                           ;;; do something
                           (inc i))))
 
-;;If you need to evaluate on a test set and don’t need the prediction output, call the `score` function with a DataIter and an EvalMetric:
+;; If you need to evaluate on a test set and don’t need the prediction
+;; output, call the `score` function with a data iterator and an eval
+;; metric:
 
-(m/score mod {:eval-data test-data :eval-metric (eval-metric/accuracy)}) ;=>["accuracy" 0.2227]
+(m/score mod {:eval-data test-data
+              :eval-metric (eval-metric/accuracy)}) ;=>["accuracy" 0.2227]
 
-;;This runs predictions on each batch in the provided DataIter and computes the evaluation score using the provided EvalMetric. The evaluation results are stored in metric so that you can query later.
+;; This runs predictions on each batch in the provided DataIter and
+;; computes the evaluation score using the provided EvalMetric. The
+;; evaluation results are stored in the metric so that you can query
+;; them later.
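+
+;; For example, a minimal sketch (this assumes `eval-metric/get`
+;; returns the metric's current name and value, and that the test
+;; iterator is reset before scoring again):
+;; (let [metric (eval-metric/accuracy)]
+;;   (mx-io/reset test-data)
+;;   (m/score mod {:eval-data test-data :eval-metric metric})
+;;   (eval-metric/get metric))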
 
-;;Saving and Loading Module Parameters
 
-;;To save the module parameters in each training epoch, use a `checkpoint` function
 
+;;;; Saving and Loading
+
+;; To save the module parameters in each training epoch, use the
+;; `save-checkpoint` function:
 
 (let [save-prefix "my-model"]
   (doseq [epoch-num (range 3)]
     (mx-io/do-batches train-data (fn [batch
                                           ;; do something
-]))
-    (m/save-checkpoint mod {:prefix save-prefix :epoch epoch-num :save-opt-states true})))
+                                     ]))
+    (m/save-checkpoint mod {:prefix save-prefix
+                            :epoch epoch-num
+                            :save-opt-states true})))
 
 ;; INFO  org.apache.mxnet.module.Module: Saved checkpoint to my-model-0000.params
 ;; INFO  org.apache.mxnet.module.Module: Saved optimizer state to my-model-0000.states
@@ -144,20 +194,22 @@
 ;; INFO  org.apache.mxnet.module.Module: Saved optimizer state to my-model-0002.states
 
 
-;;To load the saved module parameters, call the `load-checkpoint` function:
+;; To load the saved module parameters, call the `load-checkpoint`
+;; function:
 
 (def new-mod (m/load-checkpoint {:prefix "my-model" :epoch 1 :load-optimizer-states true}))
 
 new-mod ;=> #object[org.apache.mxnet.module.Module 0x5304d0f4 "org.apache.mxnet.module.Module@5304d0f4"]
 
-;;To initialize parameters, Bind the symbols to construct executors first with bind function. Then, initialize the parameters and auxiliary states by calling `init-params` function.
-
+;; To initialize parameters, bind the symbols to construct executors
+;; first with the `bind` function. Then, initialize the parameters and
+;; auxiliary states by calling the `init-params` function.
 (-> new-mod
-    (m/bind {:data-shapes (mx-io/provide-data train-data) :label-shapes (mx-io/provide-label train-data)})
+    (m/bind {:data-shapes (mx-io/provide-data train-data)
+             :label-shapes (mx-io/provide-label train-data)})
     (m/init-params))
 
-;;To get current parameters, use `params`
-
+;; To get the current parameters, use `params`:
 (let [[arg-params aux-params] (m/params new-mod)]
   {:arg-params arg-params
    :aux-params aux-params})
@@ -178,20 +230,24 @@ new-mod ;=> #object[org.apache.mxnet.module.Module 0x5304d0f4 "org.apache.mxnet.
 ;;  :aux-params {}}
 
 
-;;To assign parameter and aux state values, use `set-params` function.
+;; To assign parameter and aux state values, use the `set-params`
+;; function:
+(m/set-params new-mod {:arg-params (m/arg-params new-mod)
+                       :aux-params (m/aux-params new-mod)})
 
-(m/set-params new-mod {:arg-params (m/arg-params new-mod) :aux-params (m/aux-params new-mod)})
-;=> #object[org.apache.mxnet.module.Module 0x5304d0f4 "org.apache.mxnet.module.Module@5304d0f4"]
 
-;;To resume training from a saved checkpoint, instead of calling `set-params`, directly call `fit`, passing the loaded parameters, so that `fit` knows to start from those parameters instead of initializing randomly:
+;; To resume training from a saved checkpoint, pass the loaded
+;; parameters to the `fit` function, so that `fit` starts from those
+;; parameters instead of initializing them randomly.
 
-;; reset the training data before calling fit or you will get an error
+;; (First, reset the training data before calling `fit` or you will
+;; get an error.)
 (mx-io/reset train-data)
 (mx-io/reset test-data)
 
-(m/fit new-mod {:train-data train-data :eval-data test-data :num-epoch 2
-                :fit-params (-> (m/fit-params {:begin-epoch 1}))})
-
-;;Create fit-params, and then use it to set `begin-epoch` so that fit() knows to resume from a saved epoch.
-
-
+;; Create `fit-params` and then use it to set `begin-epoch` so that
+;; `fit` knows to resume from a saved epoch.
+(m/fit new-mod {:train-data train-data
+                :eval-data test-data
+                :num-epoch 2
+                :fit-params (m/fit-params {:begin-epoch 1})})
diff --git a/contrib/clojure-package/examples/tutorial/src/tutorial/ndarray.clj b/contrib/clojure-package/examples/tutorial/src/tutorial/ndarray.clj
index 858316eefdc..8e51de21515 100644
--- a/contrib/clojure-package/examples/tutorial/src/tutorial/ndarray.clj
+++ b/contrib/clojure-package/examples/tutorial/src/tutorial/ndarray.clj
@@ -16,42 +16,53 @@
 ;;
 
 (ns tutorial.ndarray
+  "A REPL tutorial of the MXNet Clojure API for NDArray, based on
+  https://mxnet.incubator.apache.org/api/clojure/ndarray.html"
   (:require [org.apache.clojure-mxnet.ndarray :as ndarray]
             [org.apache.clojure-mxnet.context :as context]))
 
-;;The NDArray package (mxnet.ndarray) contains tensor operations similar to numpy.ndarray. The syntax is also similar, except for some additional calls for dealing with I/O and multiple devices.
+;; The NDArray API contains tensor operations similar to
+;; `numpy.ndarray`. The syntax is also similar, except for some
+;; additional calls for dealing with I/O and multiple devices.
 
-;;Create NDArray
-;;Create mxnet.ndarray as follows:
 
-(def a (ndarray/zeros [100 50])) ;;all zero arrray of dimension 100 x 50
-(def b (ndarray/ones [256 32 128 1])) ;; all one array of dimension
-(def c (ndarray/array [1 2 3 4 5 6] [2 3])) ;; array with contents of a shape 2 x 3
+;;;; Create NDArray
 
-;;; There are also ways to convert to a vec or get the shape as an object or vec
+;; Create an MXNet NDArray as follows:
+(def a (ndarray/zeros [100 50]))            ; all-zero array of dimension 100 x 50
+(def b (ndarray/ones [256 32 128 1]))       ; all-one array of given dimensions
+(def c (ndarray/array [1 2 3 4 5 6] [2 3])) ; array with given contents in shape 2 x 3
+
+;;; There are also ways to convert an NDArray to a vec or to get the
+;;; shape as an object or vec:
 (ndarray/->vec c) ;=> [1.0 2.0 3.0 4.0 5.0 6.0]
 (ndarray/shape c) ;=> #object[org.apache.mxnet.Shape 0x583c865 "(2,3)"]
 (ndarray/shape-vec c) ;=> [2 3]
 
 
-;; NDArray Operations
 
-;; Arithmtic Operations
+;; There are some basic NDArray operations, like arithmetic and slice
+;; operations.
+
+
+;;;; NDArray Operations: Arithmetic
+
 (def a (ndarray/ones [1 5]))
 (def b (ndarray/ones [1 5]))
-(-> (ndarray/+ a b) (ndarray/->vec)) ;=>  [2.0 2.0 2.0 2.0 2.0]
+(ndarray/->vec (ndarray/+ a b)) ;=>  [2.0 2.0 2.0 2.0 2.0]
 
 ;; original ndarrays are unchanged
 (ndarray/->vec a) ;=> [1.0 1.0 1.0 1.0 1.0]
 (ndarray/->vec b) ;=> [1.0 1.0 1.0 1.0 1.0]
 
-;;inplace operators
+;; inplace operators
 (ndarray/+= a b)
 (ndarray/->vec a) ;=>  [2.0 2.0 2.0 2.0 2.0]
 
-;; other arthimetic operations are similar
+;; Other arithmetic operations are similar.
+
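+;; For example (a quick sketch using the values defined above, where
+;; `a` is now all 2s and `b` is all 1s):
+(ndarray/->vec (ndarray/- a b)) ;=> [1.0 1.0 1.0 1.0 1.0]
+(ndarray/->vec (ndarray/* a b)) ;=> [2.0 2.0 2.0 2.0 2.0]
+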
 
-;; Slice operations
+;;;; NDArray Operations: Slice
 
 (def a (ndarray/array [1 2 3 4 5 6] [3 2]))
 (def a1 (ndarray/slice a 1))
@@ -62,7 +73,8 @@
 (ndarray/shape-vec a2) ;=>[2 2]
 (ndarray/->vec a2) ;=> [3.0 4.0 5.0 6.0]
 
-;; Dot Product
+
+;;;; NDArray Operations: Dot Product
 
 (def arr1 (ndarray/array [1 2] [1 2]))
 (def arr2 (ndarray/array [3 4] [2 1]))
@@ -70,23 +82,40 @@
 (ndarray/shape-vec res) ;=> [1 1]
 (ndarray/->vec res) ;=> [11.0]
 
-;;Save and Load NDArray
-;;You can use MXNet functions to save and load a map of NDArrays from file systems, as follows:
+
+;;;; Save and Load NDArray
+
+;; You can use MXNet functions to save and load a map of NDArrays from
+;; file systems, as follows:
 
 (ndarray/save "filename" {"arr1" arr1 "arr2" arr2})
-;; you can also do "s3://path" or "hdfs"
+;; (you can also do "s3://path" or "hdfs")
+
+(ndarray/save "/Users/daveliepmann/src/coursework/mxnet-clj-tutorials/abc"
+              {"arr1" arr1 "arr2" arr2})
 
-;; to load
+;; To load:
 (def from-file (ndarray/load "filename"))
+
 from-file ;=>{"arr1" #object[org.apache.mxnet.NDArray 0x6115ba61 "org.apache.mxnet.NDArray@43d85753"], "arr2" #object[org.apache.mxnet.NDArray 0x374b5eff "org.apache.mxnet.NDArray@5c93def4"]}
 
-;;Multi-Device Support
+;; An advantage of the `save` and `load` interface is that the same
+;; format works across all MXNet language bindings. These functions
+;; also already support Amazon S3 and HDFS.
+
+
+;;;; Multi-Device Support
 
-;;Device information is stored in the mxnet.Context structure. When creating NDArray in MXNet, you can use the context argument (the default is the CPU context) to create arrays on specific devices as follows:
+;; Device information is stored in the `mxnet.Context` structure. When
+;; creating an NDArray in MXNet, you can use the context argument (the
+;; default is the CPU context) to create arrays on specific devices as
+;; follows:
 
 (def cpu-a (ndarray/zeros [100 200]))
 (ndarray/context cpu-a) ;=> #object[org.apache.mxnet.Context 0x3f376123 "cpu(0)"]
 
 (def gpu-b (ndarray/zeros [100 200] {:ctx (context/gpu 0)})) ;; to use with gpu
 
-;;Currently, we do not allow operations among arrays from different contexts. To manually enable this, use the copyto  function to copy the content to different devices, and continue computation:
+;; Currently, we do not allow operations among arrays from different
+;; contexts. To manually enable this, use the `copy-to` function to
+;; copy the content to different devices, and continue computation.
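+
+;; A minimal sketch (this assumes a GPU build; `copy-to` takes an
+;; NDArray and a target context and returns the copy):
+;; (def gpu-a (ndarray/copy-to cpu-a (context/gpu 0)))
+;; (ndarray/->vec (ndarray/+ gpu-a gpu-b))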
diff --git a/contrib/clojure-package/examples/tutorial/src/tutorial/symbol.clj b/contrib/clojure-package/examples/tutorial/src/tutorial/symbol.clj
index bec71dee81f..ebf4f7e9679 100644
--- a/contrib/clojure-package/examples/tutorial/src/tutorial/symbol.clj
+++ b/contrib/clojure-package/examples/tutorial/src/tutorial/symbol.clj
@@ -16,79 +16,66 @@
 ;;
 
 (ns tutorial.symbol
+  "A REPL tutorial of the MXNet Clojure Symbolic API, based on
+  https://mxnet.incubator.apache.org/api/clojure/symbol.html"
   (:require [org.apache.clojure-mxnet.executor :as executor]
             [org.apache.clojure-mxnet.ndarray :as ndarray]
             [org.apache.clojure-mxnet.symbol :as sym]
             [org.apache.clojure-mxnet.context :as context]))
 
-;; How to compose symbols
-;;The symbolic API provides a way to configure computation graphs. You can configure the graphs either at the level of neural network layer operations or as fine-grained operations.
 
-;;The following example configures a two-layer neural network.
+;;;; How to Compose Symbols
 
+;; The symbolic API provides a way to configure computation
+;; graphs. You can configure the graphs either at the level of neural
+;; network layer operations or as fine-grained operations.
+
+;; The following example configures a two-layer neural network.
 (def data (sym/variable "data"))
 (def fc1 (sym/fully-connected "fc1" {:data data :num-hidden 128}))
 (def act1 (sym/activation "act1" {:data fc1 :act-type "relu"}))
 (def fc2 (sym/fully-connected "fc2" {:data act1 :num-hidden 64}))
 (def net (sym/softmax-output "out" {:data fc2}))
 
-;; you could also combine this more dynamically with
+;; This can also be combined more dynamically with the `as->` Clojure
+;; threading form.
 (as-> (sym/variable "data") data
   (sym/fully-connected "fc1" {:data data :num-hidden 128})
-  (sym/activation "act1" {:data data :act-type "relu"})
+  (sym/activation "act1"     {:data data :act-type "relu"})
   (sym/fully-connected "fc2" {:data data :num-hidden 64})
-  (sym/softmax-output "out" {:data data}))
+  (sym/softmax-output "out"  {:data data}))
 
 net ;=> #object[org.apache.mxnet.Symbol 0x5c78c8c2 "org.apache.mxnet.Symbol@5c78c8c2"] 
 
-
-;;The basic arithmetic operators (plus, minus, div, multiplication)
-
-;;The following example creates a computation graph that adds two inputs together.
+;; The basic arithmetic operators (plus, minus, div, multiplication)
+;; work as expected. The following example creates a computation graph
+;; that adds two inputs together.
 
 (def a (sym/variable "a"))
 (def b (sym/variable "b"))
 (def c (sym/+ a b))
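+
+;; A symbol can be viewed as a function taking several arguments, and
+;; those arguments can be listed (a quick check on the `c` defined
+;; just above):
+(sym/list-arguments c) ;=> ["a" "b"]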
 
-;; Each symbol takes a (unique) string name. NDArray and Symbol both represent a single tensor. Operators represent the computation between tensors. Operators take symbol (or NDArray) as inputs and might also additionally accept other hyperparameters such as the number of hidden neurons (num_hidden) or the activation type (act_type) and produce the output.
-
-;; We can view a symbol simply as a function taking several arguments. And we can retrieve those arguments with the following method call:
-
-;;We can view a symbol simply as a function taking several arguments. And we can retrieve those arguments with the following method call:
-
-(sym/list-arguments net)
-                                        ;=> ["data" "fc1_weight" "fc1_bias" "fc2_weight" "fc2_bias" "out_label"]
-
-;; These arguments are the parameters and inputs needed by each symbol:
-
-;; data: Input data needed by the variable data.
-;; fc1_weight and fc1_bias: The weight and bias for the first fully connected layer fc1.
-;; fc2_weight and fc2_bias: The weight and bias for the second fully connected layer fc2.
-;; out_label: The label needed by the loss.
-
-;;We can also specify the names explicitly:
-(def net (sym/variable "data"))
-(def w (sym/variable "myweight"))
-(def net (sym/fully-connected "fc1" {:data net :weight w :num-hidden 128}))
 
-(sym/list-arguments net)
-                                        ;=> ["data" "fc1_weight" "fc1_bias" "fc2_weight" "fc2_bias" "out_label" "myweight" "fc1_bias"]
+;;;; More Complicated Compositions
 
-
-;;In the above example, FullyConnected layer has 3 inputs: data, weight, bias. When any input is not specified, a variable will be automatically generated for it.
-
-
-;; More complicated composition
-
-;;MXNet provides well-optimized symbols for layers commonly used in deep learning (see src/operator). We can also define new operators in Python. The following example first performs an element-wise add between two symbols, then feeds them to the fully connected operator:
+;; MXNet provides well-optimized symbols for layers commonly used in
+;; deep learning (see src/operator). We can also define new operators
+;; in Python. The following example first performs an element-wise add
+;; between two symbols, then feeds them to the fully connected
+;; operator:
 
 (def lhs (sym/variable "data1"))
 (def rhs (sym/variable "data2"))
-(def net (sym/fully-connected "fc1" {:data (sym/+ lhs rhs) :num-hidden 128}))
+(def net (sym/fully-connected "fc1" {:data (sym/+ lhs rhs)
+                                     :num-hidden 128}))
 (sym/list-arguments net) ;=> ["data1" "data2" "fc1_weight" "fc1_bias"]
 
-;; Group Multiple Symbols
-;;To construct neural networks with multiple loss layers, we can use mxnet.sym.Group to group multiple symbols together. The following example groups two outputs:
+
+;;;; Group Multiple Symbols
+
+;; To construct neural networks with multiple loss layers, we can use
+;; `group` to group multiple symbols together. The following example
+;; groups two outputs:
 
 (def net (sym/variable "data"))
 (def fc1 (sym/fully-connected {:data net :num-hidden 128}))
@@ -96,56 +83,49 @@ net ;=> #object[org.apache.mxnet.Symbol 0x5c78c8c2 "org.apache.mxnet.Symbol@5c78
 (def out1 (sym/softmax-output {:data net2}))
 (def out2 (sym/linear-regression-output {:data net2}))
 (def group (sym/group [out1 out2]))
-(sym/list-outputs group);=> ["softmaxoutput0_output" "linearregressionoutput0_output"]
+(sym/list-outputs group) ;=> ["softmaxoutput0_output" "linearregressionoutput0_output"]
 
 
-;; Symbol Manipulation
-;; One important difference of Symbol compared to NDArray is that we first declare the computation and then bind the computation with data to run.
+;;;; Serialization
 
-;; In this section, we introduce the functions to manipulate a symbol directly. But note that, most of them are wrapped by the module package.
+;; You can use the `save` and `load` functions to serialize Symbol
+;; objects as JSON. These functions have the advantage of being
+;; language-agnostic and cloud-friendly. You can also get a JSON
+;; string directly using `to-json`.
 
-;; Shape and Type Inference
-;; For each symbol, we can query its arguments, auxiliary states and outputs. We can also infer the output shape and type of the symbol given the known input shape or type of some arguments, which facilitates memory allocation.
-(sym/list-arguments fc1) ;=> ["data" "fullyconnected1_weight" "fullyconnected1_bias"]
-(sym/list-outputs fc1) ;=> ["fullyconnected1_output"]
+;; The following example shows how to save a symbol to a file, load it
+;; back, and compare two symbols using a JSON string. You can also
+;; save to S3.
 
-;; infer the  shapes given the shape of the input arguments
-(let [[arg-shapes out-shapes] (sym/infer-shape fc1 {:data [2 1]})]
-  {:arg-shapes arg-shapes
-   :out-shapes out-shapes}) ;=> {:arg-shapes ([2 1] [128 1] [128]), :out-shapes ([2 128])}
+(def a (sym/variable "a"))
+(def b (sym/variable "b"))
+(def c (sym/+ a b))
+(sym/save c "symbol-c.json")
+(def c2 (sym/load "symbol-c.json"))
+(= (sym/to-json c) (sym/to-json c2)) ;=>true
 
-;; Bind with Data and Evaluate
-;; The symbol c constructed above declares what computation should be run. To evaluate it, we first need to feed the arguments, namely free variables, with data.
 
-;; We can do it by using the bind method, which accepts device context and a dict mapping free variable names to NDArrays as arguments and returns an executor. The executor provides forward method for evaluation and an attribute outputs to get all the results.
+;;;; Executing Symbols
+
+;; To execute symbols, first we need to define the data that they
+;; should run on. We can do this with the `bind` function, which
+;; returns an executor. We then use `forward` to evaluate and
+;; `outputs` to get the results.
 
 (def a (sym/variable "a"))
 (def b (sym/variable "b"))
 (def c (sym/+ a b))
 
-(def ex (sym/bind c {"a" (ndarray/ones [2 2]) "b" (ndarray/ones [2 2])}))
+(def ex
+  (sym/bind c {"a" (ndarray/ones [2 2])
+               "b" (ndarray/ones [2 2])}))
+
 (-> (executor/forward ex)
     (executor/outputs)
     (first)
     (ndarray/->vec));=>  [2.0 2.0 2.0 2.0]
 
-;;We can evaluate the same symbol on GPU with different data.
-;; To do this you must have the correct native library jar defined as a dependency
-
-;;Note In order to execute the following section on a cpu set gpu_device to (cpu).
-
-
-(def ex (sym/bind c (context/gpu 0) {"a" (ndarray/ones [2 2]) "b" (ndarray/ones [2 2])}))
-
-;; Serialization
-;; There are two ways to save and load the symbols. You can use the mxnet.Symbol.save and mxnet.Symbol.load functions to serialize the Symbol objects. The advantage of using save and load functions is that it is language agnostic and cloud friendly. The symbol is saved in JSON format. You can also get a JSON string directly using mxnet.Symbol.toJson. Refer to API documentation for more details.
-
-;; The following example shows how to save a symbol to a file, load it back, and compare two symbols using a JSON string. You can also save to S3 as well
-
-(def a (sym/variable "a"))
-(def b (sym/variable "b"))
-(def c (sym/+ a b))
-(sym/save c "symbol-c.json")
-(def c2 (sym/load "symbol-c.json"))
-(= (sym/to-json c) (sym/to-json c2)) ;=>true
-
+;; We can evaluate the same symbol on GPU with different data.
+;; (To do this you must have the correct native library jar defined as a dependency.)
+(def ex (sym/bind c (context/gpu 0) {"a" (ndarray/ones [2 2])
+                                     "b" (ndarray/ones [2 2])}))
diff --git a/docs/api/clojure/ndarray.md b/docs/api/clojure/ndarray.md
index 814df8b2c6c..b0e5c991f7d 100644
--- a/docs/api/clojure/ndarray.md
+++ b/docs/api/clojure/ndarray.md
@@ -91,8 +91,9 @@ You can use MXNet functions to save and load a list or dictionary of NDArrays fr
 ```clojure
 (ndarray/save "filename" {"arr1" arr1 "arr2" arr2})
 ;; you can also do "s3://path" or "hdfs"
+```
 
-To load
+To load:
 
 ```clojure
 (def from-file (ndarray/load "filename"))
diff --git a/docs/api/clojure/symbol.md b/docs/api/clojure/symbol.md
index 85e1977362e..1ec841f153c 100644
--- a/docs/api/clojure/symbol.md
+++ b/docs/api/clojure/symbol.md
@@ -7,7 +7,6 @@ Topics:
 * [Group Multiple Symbols](#group-multiple-symbols)
 * [Serialization](#serialization)
 * [Executing Symbols](#executing-symbols)
-* [Multiple Outputs](#multiple-outputs)
 * [Symbol API Reference](http://mxnet.incubator.apache.org/api/clojure/docs/org.apache.clojure-mxnet.symbol.html)
 
 
@@ -128,23 +127,6 @@ _To do this you must have the correct native library jar defined as a dependency
 (def ex (sym/bind c (context/gpu 0) {"a" (ndarray/ones [2 2]) "b" (ndarray/ones [2 2])}))
 ```
 
-## Multiple Outputs
-
-To construct neural networks with multiple loss layers, we can use mxnet.sym.Group to group multiple symbols together. The following example groups two outputs:
-
-```clojure
-(def net (sym/variable "data"))
-(def fc1 (sym/fully-connected {:data net :num-hidden 128}))
-(def net2 (sym/activation {:data fc1 :act-type "relu"}))
-(def out1 (sym/softmax-output {:data net2}))
-(def out2 (sym/linear-regression-output {:data net2}))
-(def group (sym/group [out1 out2]))
-(sym/list-outputs group);=> ["softmaxoutput0_output" "linearregressionoutput0_output"]
-```
-
-After you get the ```group```, you can bind on ```group``` instead.
-The resulting executor will have two outputs, one for `linerarregressionoutput_output` and one for `softmax_output`.
-
 ## Next Steps
 * See [NDArray API](ndarray.md) for vector/matrix/tensor operations.
 * See [KVStore API](kvstore.md) for multi-GPU and multi-host distributed training.


 

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services