Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2018/09/20 04:59:50 UTC

[GitHub] sandeep-krishnamurthy closed pull request #12190: Updated Symbol tutorial with Gluon

URL: https://github.com/apache/incubator-mxnet/pull/12190

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:


diff --git a/docs/tutorials/basic/symbol.md b/docs/tutorials/basic/symbol.md
index 6a4bb872d04..7ebcadfc16f 100644
--- a/docs/tutorials/basic/symbol.md
+++ b/docs/tutorials/basic/symbol.md
@@ -5,10 +5,10 @@ the basic data structure for manipulating data in MXNet.
 And just using NDArray by itself, we can execute a wide range of mathematical operations.
 In fact, we could define and update a full neural network just by using `NDArray`.
 `NDArray` allows you to write programs for scientific computation
-in an imperative fashion, making full use of the native control of any front-end language.
+in an imperative fashion, making full use of the native control of any front-end language. Gluon uses this approach under the hood (before hybridization) to allow for flexible and debuggable networks.
 So you might wonder, why don't we just use `NDArray` for all computation?
 
-MXNet provides the Symbol API, an interface for symbolic programming.
+MXNet also provides the Symbol API, an interface for symbolic programming.
 With symbolic programming, rather than executing operations step by step,
 we first define a *computation graph*.
 This graph contains placeholders for inputs and designated outputs.
@@ -16,7 +16,7 @@ We can then compile the graph, yielding a function
 that can be bound to `NDArray`s and run.
 MXNet's Symbol API is similar to the network configurations
 used by [Caffe](http://caffe.berkeleyvision.org/)
-and the symbolic programming in [Theano](http://deeplearning.net/software/theano/).
+and the symbolic programming in [Theano](http://deeplearning.net/software/theano/). Gluon takes advantage of this approach under the hood once the network has been hybridized.
 
 Another advantage conferred by symbolic approach is that
 we can optimize our functions before using them.
@@ -291,7 +291,7 @@ One important difference of `Symbol` compared to `NDArray` is that we first
 declare the computation and then bind the computation with data to run.
 
 In this section, we introduce the functions to manipulate a symbol directly. But
-note that, most of them are wrapped by the `module` package.
+note that most of them are wrapped by the high-level packages `Module` and `Gluon`.
 
 ### Shape and Type Inference
 

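For readers skimming this diff, the define-then-bind idea the tutorial contrasts with imperative NDArray code can be sketched in plain Python. The toy classes below are purely illustrative, not MXNet's API; the real workflow uses `mx.sym.Variable`, `Symbol.bind`, and `Executor.forward`:

```python
# Minimal sketch of symbolic (define-then-bind) execution.
# Hypothetical toy classes for illustration only -- not MXNet's API.

class Variable:
    def __init__(self, name):
        self.name = name
    def __add__(self, other):
        return Op('add', self, other)
    def eval(self, bindings):
        # A variable is a placeholder: its value comes from the bindings.
        return bindings[self.name]

class Op:
    def __init__(self, kind, lhs, rhs):
        self.kind, self.lhs, self.rhs = kind, lhs, rhs
    def __add__(self, other):
        return Op('add', self, other)
    def eval(self, bindings):
        # Evaluation happens only after data is bound, not at graph-build time.
        a, b = self.lhs.eval(bindings), self.rhs.eval(bindings)
        return a + b

# Define the graph first (no computation happens here)...
a, b = Variable('a'), Variable('b')
c = a + b
# ...then bind data and run, as with Symbol.bind() in MXNet.
print(c.eval({'a': 1, 'b': 2}))  # 3
```

Because the graph exists before any data flows through it, a backend is free to optimize or fuse the operations before execution, which is the advantage the tutorial text describes.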

 

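The "Shape and Type Inference" section the last hunk touches refers to propagating known input shapes through a declared graph before any data is bound. A toy version of that propagation for elementwise addition (illustrative only; MXNet's real mechanism is `Symbol.infer_shape`, which handles broadcasting and many more operators):

```python
# Toy shape inference for an elementwise-add graph.
# Illustrative only -- MXNet's Symbol.infer_shape is far more general.

def infer_shape(node, shapes):
    """Return the output shape of `node` given input shapes.

    `node` is either a variable name (str) or a tuple
    ('add', lhs, rhs) for an elementwise addition.
    """
    if isinstance(node, str):
        return shapes[node]
    op, lhs, rhs = node
    ls, rs = infer_shape(lhs, shapes), infer_shape(rhs, shapes)
    # Elementwise ops require matching operand shapes in this sketch.
    assert ls == rs, "elementwise add needs matching shapes"
    return ls

graph = ('add', 'a', ('add', 'b', 'c'))
print(infer_shape(graph, {'a': (2, 3), 'b': (2, 3), 'c': (2, 3)}))  # (2, 3)
```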
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services