Posted to commits@mxnet.apache.org by sk...@apache.org on 2018/09/20 04:59:58 UTC

[incubator-mxnet] branch master updated: Updated Symbol tutorial with Gluon (#12190)

This is an automated email from the ASF dual-hosted git repository.

skm pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
     new 7797584  Updated Symbol tutorial with Gluon (#12190)
7797584 is described below

commit 7797584450186d36e52c902c3b606f4b4676e0a3
Author: Thom Lane <th...@gmail.com>
AuthorDate: Wed Sep 19 21:59:48 2018 -0700

    Updated Symbol tutorial with Gluon (#12190)
    
    * Corrections to profiling tutorial
    
    Corrected a race condition with stopping profiling. Added mx.nd.waitall to ensure all operations have completed, including GPU operations that might otherwise be missing.
    
    Also added alternative code for GPU vs. CPU context selection; the previous code raised an error on machines with nvidia-smi.
    
    * Updated tutorial to include references to Gluon.
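
    The GPU-vs-CPU fallback described above is commonly written as a
    try/except probe. A minimal sketch of that pattern in plain Python
    (the `gpu_probe` callable and `pick_context` name are illustrative,
    not MXNet's API; in MXNet the probe would be an operation such as
    `mx.nd.zeros((1,), ctx=mx.gpu())`, which raises when no GPU is usable):

```python
def pick_context(gpu_probe):
    """Return "gpu" if the probe succeeds, else fall back to "cpu".

    Hypothetical sketch of the try/except fallback pattern; the probe
    stands in for a small allocation attempted on the GPU context.
    """
    try:
        gpu_probe()
        return "gpu"
    except Exception:
        return "cpu"

print(pick_context(lambda: None))   # probe succeeds -> gpu
print(pick_context(lambda: 1 / 0))  # probe raises   -> cpu
```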
---
 docs/tutorials/basic/symbol.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/tutorials/basic/symbol.md b/docs/tutorials/basic/symbol.md
index 6a4bb87..7ebcadf 100644
--- a/docs/tutorials/basic/symbol.md
+++ b/docs/tutorials/basic/symbol.md
@@ -5,10 +5,10 @@ the basic data structure for manipulating data in MXNet.
 And just using NDArray by itself, we can execute a wide range of mathematical operations.
 In fact, we could define and update a full neural network just by using `NDArray`.
 `NDArray` allows you to write programs for scientific computation
-in an imperative fashion, making full use of the native control of any front-end language.
+in an imperative fashion, making full use of the native control flow of any front-end language. Gluon uses this approach under the hood (before hybridization) to allow for flexible and debuggable networks.
 So you might wonder, why don't we just use `NDArray` for all computation?
 
-MXNet provides the Symbol API, an interface for symbolic programming.
+MXNet also provides the Symbol API, an interface for symbolic programming.
 With symbolic programming, rather than executing operations step by step,
 we first define a *computation graph*.
 This graph contains placeholders for inputs and designated outputs.
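
The imperative/symbolic contrast the hunk above draws can be sketched with a
toy deferred-evaluation graph in plain Python. This is an illustration of the
idea only, not MXNet's `Symbol` implementation; the `Sym` class and its
`eval` method are hypothetical names:

```python
class Sym:
    """Tiny stand-in for a symbolic node: building one records the
    computation; nothing runs until data is bound via eval()."""
    def __init__(self, run, name):
        self.run, self.name = run, name

    @staticmethod
    def var(name):
        # Placeholder for an input, like a declared graph variable.
        return Sym(lambda env: env[name], name)

    def __add__(self, other):
        return Sym(lambda env: self.run(env) + other.run(env),
                   "(%s + %s)" % (self.name, other.name))

    def __mul__(self, other):
        return Sym(lambda env: self.run(env) * other.run(env),
                   "(%s * %s)" % (self.name, other.name))

    def eval(self, **env):
        # Analogous to binding concrete arrays to the graph and running it.
        return self.run(env)

# Declare the graph first -- no arithmetic happens here.
a, b = Sym.var("a"), Sym.var("b")
c = (a + b) * a
print(c.name)            # -> ((a + b) * a)
print(c.eval(a=2, b=3))  # -> 10
```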
@@ -16,7 +16,7 @@ We can then compile the graph, yielding a function
 that can be bound to `NDArray`s and run.
 MXNet's Symbol API is similar to the network configurations
 used by [Caffe](http://caffe.berkeleyvision.org/)
-and the symbolic programming in [Theano](http://deeplearning.net/software/theano/).
+and the symbolic programming in [Theano](http://deeplearning.net/software/theano/). Gluon takes advantage of this approach under the hood once the network has been hybridized.
 
 Another advantage conferred by the symbolic approach is that
 we can optimize our functions before using them.
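
One optimization that knowing the whole graph ahead of time makes possible is
simplifying it before any data is bound. A toy constant-folding pass over a
tuple-based expression tree (illustrative only; the `fold` function and node
encoding are made up for this sketch, not MXNet internals):

```python
def fold(node):
    """Constant-fold a tiny expression tree before execution.

    Nodes are ("+" | "*", left, right) tuples, numbers, or variable
    names; subtrees with only numeric leaves are evaluated eagerly.
    """
    if not isinstance(node, tuple):
        return node
    op, left, right = node
    left, right = fold(left), fold(right)
    if isinstance(left, (int, float)) and isinstance(right, (int, float)):
        return left + right if op == "+" else left * right
    return (op, left, right)

# (2 * 3) + x collapses to 6 + x before any data is bound:
print(fold(("+", ("*", 2, 3), "x")))  # -> ('+', 6, 'x')
```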
@@ -291,7 +291,7 @@ One important difference of `Symbol` compared to `NDArray` is that we first
 declare the computation and then bind the computation with data to run.
 
 In this section, we introduce the functions to manipulate a symbol directly. But
-note that, most of them are wrapped by the `module` package.
+note that most of them are wrapped by the high-level packages: `Module` and `Gluon`.
 
 ### Shape and Type Inference
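
Shape inference, which the section above introduces, works because the graph
is declared before any data exists: shapes can be propagated through it
symbolically. A toy version for a single matrix-multiply node (illustrative
only; `infer_matmul_shape` is a made-up name, not `Symbol.infer_shape`):

```python
def infer_matmul_shape(a_shape, b_shape):
    """Infer the output shape of a matrix product from operand shapes
    alone -- no data needs to exist yet, only the declared graph."""
    (m, k1), (k2, n) = a_shape, b_shape
    if k1 != k2:
        raise ValueError("incompatible shapes: %r x %r" % (a_shape, b_shape))
    return (m, n)

print(infer_matmul_shape((4, 3), (3, 2)))  # -> (4, 2)
```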