Posted to commits@mxnet.apache.org by wk...@apache.org on 2019/07/13 00:26:42 UTC

[incubator-mxnet] branch master updated: Docs: Fix misprints (#15505)

This is an automated email from the ASF dual-hosted git repository.

wkcn pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
     new cbb6f7f  Docs: Fix misprints (#15505)
cbb6f7f is described below

commit cbb6f7fd6e297c17fd267b29174a4ed29100c757
Author: Ruslan Baratov <ru...@yahoo.com>
AuthorDate: Sat Jul 13 03:26:01 2019 +0300

    Docs: Fix misprints (#15505)
    
    * Docs: Fix 'bahavior' -> 'behavior'
    
    * Docs: Fix 'the the' -> 'the'
    
    * retrigger CI
    
    * retrigger CI
---
 NEWS.md                                                               | 4 ++--
 R-package/R/viz.graph.R                                               | 4 ++--
 contrib/clojure-package/README.md                                     | 2 +-
 docs/api/python/gluon/gluon.md                                        | 2 +-
 docs/install/windows_setup.md                                         | 2 +-
 docs/tutorials/mkldnn/MKLDNN_README.md                                | 2 +-
 example/gan/CGAN_mnist_R/README.md                                    | 2 +-
 include/mxnet/ndarray.h                                               | 2 +-
 perl-package/AI-MXNet/lib/AI/MXNet/Gluon.pm                           | 2 +-
 perl-package/AI-MXNet/lib/AI/MXNet/Module/Base.pm                     | 2 +-
 python/mxnet/contrib/onnx/onnx2mx/_op_translations.py                 | 2 +-
 python/mxnet/gluon/data/dataloader.py                                 | 2 +-
 python/mxnet/module/base_module.py                                    | 2 +-
 python/mxnet/module/python_module.py                                  | 2 +-
 .../core/src/main/scala/org/apache/mxnet/module/BaseModule.scala      | 2 +-
 .../org/apache/mxnetexamples/javaapi/infer/objectdetector/README.md   | 2 +-
 .../java/org/apache/mxnetexamples/javaapi/infer/predictor/README.md   | 2 +-
 .../scala/org/apache/mxnetexamples/infer/objectdetector/README.md     | 2 +-
 src/operator/tensor/diag_op-inl.h                                     | 2 +-
 src/operator/tensor/matrix_op.cc                                      | 2 +-
 tools/staticbuild/README.md                                           | 4 ++--
 21 files changed, 24 insertions(+), 24 deletions(-)

diff --git a/NEWS.md b/NEWS.md
index 59f8de8..ee8a73c 100644
--- a/NEWS.md
+++ b/NEWS.md
@@ -678,8 +678,8 @@ This fixes an buffer overflow detected by ASAN.
   This PR adds or updates the docs for the infer_range feature.
 
   Clarifies the param in the C op docs
-  Clarifies the param in the the Scala symbol docs
-  Adds the param for the the Scala ndarray docs
+  Clarifies the param in the Scala symbol docs
+  Adds the param for the Scala ndarray docs
   Adds the param for the Python symbol docs
   Adds the param for the Python ndarray docs
 
diff --git a/R-package/R/viz.graph.R b/R-package/R/viz.graph.R
index 5804372..ab876af 100644
--- a/R-package/R/viz.graph.R
+++ b/R-package/R/viz.graph.R
@@ -34,7 +34,7 @@
 #' @param symbol a \code{string} representing the symbol of a model.
 #' @param shape a \code{numeric} representing the input dimensions to the symbol.
 #' @param direction a \code{string} representing the direction of the graph, either TD or LR.
-#' @param type a \code{string} representing the rendering engine of the the graph, either graph or vis.
+#' @param type a \code{string} representing the rendering engine of the graph, either graph or vis.
 #' @param graph.width.px a \code{numeric} representing the size (width) of the graph. In pixels
 #' @param graph.height.px a \code{numeric} representing the size (height) of the graph. In pixels
 #'
@@ -169,4 +169,4 @@ graph.viz <- function(symbol, shape=NULL, direction="TD", type="graph", graph.wi
   return(graph_render)
 }
 
-globalVariables(c("color", "shape", "label", "id", ".", "op"))
\ No newline at end of file
+globalVariables(c("color", "shape", "label", "id", ".", "op"))
diff --git a/contrib/clojure-package/README.md b/contrib/clojure-package/README.md
index 7566ade..7bb417e 100644
--- a/contrib/clojure-package/README.md
+++ b/contrib/clojure-package/README.md
@@ -237,7 +237,7 @@ If you are having trouble getting started or have a question, feel free to reach
 There are quite a few examples in the examples directory. To use.
 
 `lein install` in the main project
-`cd` in the the example project of interest
+`cd` in the example project of interest
 
 There are README is every directory outlining instructions.
 
diff --git a/docs/api/python/gluon/gluon.md b/docs/api/python/gluon/gluon.md
index c063a71..19e462e 100644
--- a/docs/api/python/gluon/gluon.md
+++ b/docs/api/python/gluon/gluon.md
@@ -28,7 +28,7 @@
 
 The Gluon package is a high-level interface for MXNet designed to be easy to use, while keeping most of the flexibility of a low level API. Gluon supports both imperative and symbolic programming, making it easy to train complex models imperatively in Python and then deploy with a symbolic graph in C++ and Scala.
 
-Based on the the [Gluon API specification](https://github.com/gluon-api/gluon-api), the Gluon API in Apache MXNet provides a clear, concise, and simple API for deep learning. It makes it easy to prototype, build, and train deep learning models without sacrificing training speed.
+Based on the [Gluon API specification](https://github.com/gluon-api/gluon-api), the Gluon API in Apache MXNet provides a clear, concise, and simple API for deep learning. It makes it easy to prototype, build, and train deep learning models without sacrificing training speed.
 
 **Advantages**
 
diff --git a/docs/install/windows_setup.md b/docs/install/windows_setup.md
index 4fb4e56..8a3b1f3 100644
--- a/docs/install/windows_setup.md
+++ b/docs/install/windows_setup.md
@@ -183,7 +183,7 @@ cd C:\incubator-mxnet\build
 cmake -G "Visual Studio 15 2017 Win64" -T cuda=9.2,host=x64 -DUSE_CUDA=1 -DUSE_CUDNN=1 -DUSE_NVRTC=1 -DUSE_OPENCV=1 -DUSE_OPENMP=1 -DUSE_BLAS=open -DUSE_LAPACK=1 -DUSE_DIST_KVSTORE=0 -DCUDA_ARCH_LIST=Common -DCUDA_TOOLSET=9.2 -DCUDNN_INCLUDE=C:\cuda\include -DCUDNN_LIBRARY=C:\cuda\lib\x64\cudnn.lib "C:\incubator-mxnet"
 ```
 * Make sure you set the environment variables correctly (OpenBLAS_HOME, OpenCV_DIR) and change the version of the Visual studio 2017 to v14.11 before enter above command.
-6. After the CMake successfully completed, compile the the MXNet source code by using following command:
+6. After the CMake successfully completed, compile the MXNet source code by using following command:
 ```
 msbuild mxnet.sln /p:Configuration=Release;Platform=x64 /maxcpucount
 ```
diff --git a/docs/tutorials/mkldnn/MKLDNN_README.md b/docs/tutorials/mkldnn/MKLDNN_README.md
index 516b2b3..c9e940f 100644
--- a/docs/tutorials/mkldnn/MKLDNN_README.md
+++ b/docs/tutorials/mkldnn/MKLDNN_README.md
@@ -135,7 +135,7 @@ command:
 >"C:\Program Files (x86)\IntelSWTools\compilers_and_libraries\windows\mkl\bin\mklvars.bat" intel64
 >cmake -G "Visual Studio 14 Win64" .. -DUSE_CUDA=0 -DUSE_CUDNN=0 -DUSE_NVRTC=0 -DUSE_OPENCV=1 -DUSE_OPENMP=1 -DUSE_PROFILER=1 -DUSE_BLAS=mkl -DUSE_LAPACK=1 -DUSE_DIST_KVSTORE=0 -DCUDA_ARCH_NAME=All -DUSE_MKLDNN=1 -DCMAKE_BUILD_TYPE=Release -DMKL_ROOT="C:\Program Files (x86)\IntelSWTools\compilers_and_libraries\windows\mkl" 
 ```
-4. After the CMake successfully completed, in Visual Studio, open the solution file ```.sln``` and compile it, or compile the the MXNet source code by using following command:
+4. After the CMake successfully completed, in Visual Studio, open the solution file ```.sln``` and compile it, or compile the MXNet source code by using following command:
 ```r
 msbuild mxnet.sln /p:Configuration=Release;Platform=x64 /maxcpucount
 ```
diff --git a/example/gan/CGAN_mnist_R/README.md b/example/gan/CGAN_mnist_R/README.md
index bf0bb08..99d2e1c 100644
--- a/example/gan/CGAN_mnist_R/README.md
+++ b/example/gan/CGAN_mnist_R/README.md
@@ -94,7 +94,7 @@ update_args_D<- updater_D(weight = exec_D$ref.arg.arrays, grad = exec_D$ref.grad
 mx.exec.update.arg.arrays(exec_D, update_args_D, skip.null=TRUE)
 ```
 
-The generator loss comes from the backpropagation of the the discriminator loss into its generated output. By faking the generator labels to be real samples into the discriminator, the discriminator back-propagated loss provides the generator with the information on how to best adapt its parameters to trick the discriminator into believing the fake samples are real.
+The generator loss comes from the backpropagation of the discriminator loss into its generated output. By faking the generator labels to be real samples into the discriminator, the discriminator back-propagated loss provides the generator with the information on how to best adapt its parameters to trick the discriminator into believing the fake samples are real.
 
 This requires to backpropagate the gradients up to the input data of the discriminator (whereas this input gradient is typically ignored in vanilla feedforward network).
 
diff --git a/include/mxnet/ndarray.h b/include/mxnet/ndarray.h
index 34e891e..428245b 100644
--- a/include/mxnet/ndarray.h
+++ b/include/mxnet/ndarray.h
@@ -197,7 +197,7 @@ class NDArray {
   }
   /*
    * This indicates whether an array is a view of another array (created by
-   * reshape or slice). If an array is a view and the the data is stored in
+   * reshape or slice). If an array is a view and the data is stored in
    * MKLDNN format, we need to convert the data to the default format when
    * data in the view is accessed.
    */
diff --git a/perl-package/AI-MXNet/lib/AI/MXNet/Gluon.pm b/perl-package/AI-MXNet/lib/AI/MXNet/Gluon.pm
index 657be74..1badacb 100644
--- a/perl-package/AI-MXNet/lib/AI/MXNet/Gluon.pm
+++ b/perl-package/AI-MXNet/lib/AI/MXNet/Gluon.pm
@@ -41,7 +41,7 @@ sub model_zoo { require AI::MXNet::Gluon::ModelZoo; 'AI::MXNet::Gluon::ModelZoo'
     AI::MXNet::Gluon supports both imperative and symbolic programming,
     making it easy to train complex models imperatively in Perl.
 
-    Based on the the Gluon API specification,
+    Based on the Gluon API specification,
     the Gluon API in Apache MXNet provides a clear, concise, and simple API for deep learning.
     It makes it easy to prototype, build, and train deep learning models without sacrificing training speed.
 
diff --git a/perl-package/AI-MXNet/lib/AI/MXNet/Module/Base.pm b/perl-package/AI-MXNet/lib/AI/MXNet/Module/Base.pm
index 542cf49..6b572f4 100644
--- a/perl-package/AI-MXNet/lib/AI/MXNet/Module/Base.pm
+++ b/perl-package/AI-MXNet/lib/AI/MXNet/Module/Base.pm
@@ -684,7 +684,7 @@ method output_shapes() { confess("NotImplemented") }
 
 =head2 get_params
 
-    The parameters, these are potentially a copies of the the actual parameters used
+    The parameters, these are potentially a copies of the actual parameters used
     to do computation on the device.
 
     Returns
diff --git a/python/mxnet/contrib/onnx/onnx2mx/_op_translations.py b/python/mxnet/contrib/onnx/onnx2mx/_op_translations.py
index 24bb727..734b438 100644
--- a/python/mxnet/contrib/onnx/onnx2mx/_op_translations.py
+++ b/python/mxnet/contrib/onnx/onnx2mx/_op_translations.py
@@ -24,7 +24,7 @@ from .... import symbol
 # Method definitions for the callable objects mapped in the import_helper module
 
 def identity(attrs, inputs, proto_obj):
-    """Returns the identity function of the the input."""
+    """Returns the identity function of the input."""
     return 'identity', attrs, inputs
 
 def random_uniform(attrs, inputs, proto_obj):
diff --git a/python/mxnet/gluon/data/dataloader.py b/python/mxnet/gluon/data/dataloader.py
index accd968..65fd7d8 100644
--- a/python/mxnet/gluon/data/dataloader.py
+++ b/python/mxnet/gluon/data/dataloader.py
@@ -268,7 +268,7 @@ class _MultiWorkerIterV1(object):
         if not self._shutdown:
             # send shutdown signal to the fetcher and join data queue first
             # Remark:   loop_fetcher need to be joined prior to the workers.
-            #           otherwise, the the fetcher may fail at getting data
+            #           otherwise, the fetcher may fail at getting data
             self._data_queue.put((None, None))
             self._fetcher.join()
             # send shutdown signal to all worker processes
diff --git a/python/mxnet/module/base_module.py b/python/mxnet/module/base_module.py
index 754e369..0d5515f 100644
--- a/python/mxnet/module/base_module.py
+++ b/python/mxnet/module/base_module.py
@@ -619,7 +619,7 @@ class BaseModule(object):
     # Parameters of a module
     ################################################################################
     def get_params(self):
-        """Gets parameters, those are potentially copies of the the actual parameters used
+        """Gets parameters, those are potentially copies of the actual parameters used
         to do computation on the device.
 
         Returns
diff --git a/python/mxnet/module/python_module.py b/python/mxnet/module/python_module.py
index df1648e..a5d6f15 100644
--- a/python/mxnet/module/python_module.py
+++ b/python/mxnet/module/python_module.py
@@ -94,7 +94,7 @@ class PythonModule(BaseModule):
     # Parameters of a module
     ################################################################################
     def get_params(self):
-        """Gets parameters, those are potentially copies of the the actual parameters used
+        """Gets parameters, those are potentially copies of the actual parameters used
         to do computation on the device. Subclass should override this method if contains
         parameters.
 
diff --git a/scala-package/core/src/main/scala/org/apache/mxnet/module/BaseModule.scala b/scala-package/core/src/main/scala/org/apache/mxnet/module/BaseModule.scala
index 7fbdae5..f2f4c20 100644
--- a/scala-package/core/src/main/scala/org/apache/mxnet/module/BaseModule.scala
+++ b/scala-package/core/src/main/scala/org/apache/mxnet/module/BaseModule.scala
@@ -306,7 +306,7 @@ abstract class BaseModule {
 
   // Parameters of a module
   /**
-   * Get parameters, those are potentially copies of the the actual parameters used
+   * Get parameters, those are potentially copies of the actual parameters used
    * to do computation on the device.
    * @return `(argParams, auxParams)`, a pair of dictionary of name to value mapping.
    */
diff --git a/scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/infer/objectdetector/README.md b/scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/infer/objectdetector/README.md
index 4c4512f..f6f8b67 100644
--- a/scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/infer/objectdetector/README.md
+++ b/scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/infer/objectdetector/README.md
@@ -103,7 +103,7 @@ Class: dog
 Probabilties: 0.8226818
 (Coord:,83.82353,179.13998,206.63783,476.7875)
 ```
-the outputs come from the the input image, with top3 predictions picked.
+the outputs come from the input image, with top3 predictions picked.
 
 
 ## Infer API Details
diff --git a/scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/infer/predictor/README.md b/scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/infer/predictor/README.md
index 09189cb..02e09f3 100644
--- a/scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/infer/predictor/README.md
+++ b/scala-package/examples/src/main/java/org/apache/mxnetexamples/javaapi/infer/predictor/README.md
@@ -75,4 +75,4 @@ Probability : 0.30337515 Class : n02123159 tiger cat
 Predict with NDArray
 Probability : 0.30337515 Class : n02123159 tiger cat
 ```
-the outputs come from the the input image, with top1 predictions picked.
\ No newline at end of file
+the outputs come from the input image, with top1 predictions picked.
diff --git a/scala-package/examples/src/main/scala/org/apache/mxnetexamples/infer/objectdetector/README.md b/scala-package/examples/src/main/scala/org/apache/mxnetexamples/infer/objectdetector/README.md
index e5d3bbe..25f040e 100644
--- a/scala-package/examples/src/main/scala/org/apache/mxnetexamples/infer/objectdetector/README.md
+++ b/scala-package/examples/src/main/scala/org/apache/mxnetexamples/infer/objectdetector/README.md
@@ -103,7 +103,7 @@ Class: dog
 Probabilties: 0.8226818
 (Coord:,83.82353,179.13998,206.63783,476.7875)
 ```
-the outputs come from the the input image, with top3 predictions picked.
+the outputs come from the input image, with top3 predictions picked.
 
 
 ## Infer API Details
diff --git a/src/operator/tensor/diag_op-inl.h b/src/operator/tensor/diag_op-inl.h
index c95c1ce..73eb4e1 100644
--- a/src/operator/tensor/diag_op-inl.h
+++ b/src/operator/tensor/diag_op-inl.h
@@ -71,7 +71,7 @@ inline mxnet::TShape DiagShapeImpl(const mxnet::TShape& ishape, const int k,
   int32_t x1 = CheckAxis(axis1, ishape.ndim());
   int32_t x2 = CheckAxis(axis2, ishape.ndim());
 
-  CHECK_NE(x1, x2) << "axis1 and axis2 cannot refer to the the same axis " << x1;
+  CHECK_NE(x1, x2) << "axis1 and axis2 cannot refer to the same axis " << x1;
 
   auto h = ishape[x1];
   auto w = ishape[x2];
diff --git a/src/operator/tensor/matrix_op.cc b/src/operator/tensor/matrix_op.cc
index c2bcb29..e78050a 100644
--- a/src/operator/tensor/matrix_op.cc
+++ b/src/operator/tensor/matrix_op.cc
@@ -270,7 +270,7 @@ NNVM_REGISTER_OP(Flatten)
 For an input array with shape ``(d1, d2, ..., dk)``, `flatten` operation reshapes
 the input array into an output array of shape ``(d1, d2*...*dk)``.
 
-Note that the bahavior of this function is different from numpy.ndarray.flatten,
+Note that the behavior of this function is different from numpy.ndarray.flatten,
 which behaves similar to mxnet.ndarray.reshape((-1,)).
 
 Example::
diff --git a/tools/staticbuild/README.md b/tools/staticbuild/README.md
index bfccbab..b861cc3 100644
--- a/tools/staticbuild/README.md
+++ b/tools/staticbuild/README.md
@@ -34,7 +34,7 @@ This would build the mxnet package based on MKLDNN and and pypi configuration se
 As the result, users would have a complete static dependencies in `/staticdeps` in the root folder as well as a static-linked `libmxnet.so` file lives in `lib`. You can build your language binding by using the `libmxnet.so`.
 
 ## `build_lib.sh`
-This script clones the most up-to-date master and builds the MXNet backend with a static library. In order to run the static library, you must set the the following environment variables:
+This script clones the most up-to-date master and builds the MXNet backend with a static library. In order to run the static library, you must set the following environment variables:
 
 - `DEPS_PATH` Path to your static dependencies
 - `STATIC_BUILD_TARGET` Either `pip` or `maven` as your publish platform
@@ -46,4 +46,4 @@ It is not recommended to run this file alone since there are a bunch of variable
 After running this script, you would have everything you need ready in the `/lib` folder.
 
 ## `build_wheel.sh`
-This script builds the python package. It also runs a sanity test.
\ No newline at end of file
+This script builds the python package. It also runs a sanity test.