Posted to commits@mxnet.apache.org by sk...@apache.org on 2018/01/17 03:41:37 UTC

[incubator-mxnet-site] branch asf-site updated: removed torch.html and references. Fixed Nesterov Momentum education link (#46)

This is an automated email from the ASF dual-hosted git repository.

skm pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet-site.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new 2075c26  removed torch.html and references. Fixed Nesterov Momentum education link (#46)
2075c26 is described below

commit 2075c265433a3163f6dc87a46812d745884089a6
Author: thinksanky <31...@users.noreply.github.com>
AuthorDate: Tue Jan 16 19:41:35 2018 -0800

    removed torch.html and references. Fixed Nesterov Momentum education link (#46)
---
 _modules/mxnet/optimizer.html                      |   2 +-
 api/python/model.html                              |   2 +-
 api/python/optimization.html                       |   2 +-
 api/python/optimization/optimization.html          |   2 +-
 faq/develop_and_hack.html                          |   1 -
 faq/index.html                                     |   2 -
 faq/torch.html                                     | 315 ---------------------
 how_to/develop_and_hack.html                       |   1 -
 how_to/index.html                                  |   2 -
 how_to/torch.html                                  | 264 -----------------
 versions/0.11.0/api/python/model.html              |   2 +-
 versions/0.11.0/api/python/optimization.html       |   2 +-
 versions/0.11.0/how_to/develop_and_hack.html       |   1 -
 versions/0.11.0/how_to/index.html                  |   2 -
 versions/0.12.0/api/python/model.html              |   2 +-
 versions/0.12.0/api/python/optimization.html       |   2 +-
 .../api/python/optimization/optimization.html      |   2 +-
 versions/0.12.0/faq/develop_and_hack.html          |   1 -
 versions/0.12.0/faq/index.html                     |   2 -
 versions/0.12.0/how_to/develop_and_hack.html       |   1 -
 versions/0.12.0/how_to/index.html                  |   2 -
 versions/0.12.1/api/python/model.html              |   2 +-
 versions/0.12.1/api/python/optimization.html       |   2 +-
 .../api/python/optimization/optimization.html      |   2 +-
 versions/0.12.1/faq/develop_and_hack.html          |   1 -
 versions/0.12.1/faq/index.html                     |   2 -
 versions/0.12.1/how_to/develop_and_hack.html       |   1 -
 versions/0.12.1/how_to/index.html                  |   2 -
 versions/master/_modules/mxnet/optimizer.html      |   2 +-
 versions/master/api/python/model.html              |   2 +-
 versions/master/api/python/optimization.html       |   2 +-
 .../api/python/optimization/optimization.html      |   2 +-
 versions/master/how_to/develop_and_hack.html       |   1 -
 versions/master/how_to/index.html                  |   1 -
 34 files changed, 16 insertions(+), 618 deletions(-)

diff --git a/_modules/mxnet/optimizer.html b/_modules/mxnet/optimizer.html
index ec3ead4..9b29ac8 100644
--- a/_modules/mxnet/optimizer.html
+++ b/_modules/mxnet/optimizer.html
@@ -1267,7 +1267,7 @@
 
 <span class="sd">    Much like Adam is essentially RMSprop with momentum,</span>
 <span class="sd">    Nadam is Adam RMSprop with Nesterov momentum available</span>
-<span class="sd">    at http://cs229.stanford.edu/proj2015/054_report.pdf.</span>
+<span class="sd">    at https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ.</span>
 
 <span class="sd">    This optimizer accepts the following parameters in addition to those accepted</span>
 <span class="sd">    by :class:`.Optimizer`.</span>
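The Nadam docstring touched by this hunk describes Adam combined with Nesterov momentum. A minimal usage sketch of the optimizer it documents, assuming MXNet with the Gluon API and purely illustrative hyperparameter values:

    # Minimal sketch: instantiate the Nadam optimizer documented above and
    # attach it to a Gluon Trainer. Hyperparameter values are illustrative.
    import mxnet as mx
    from mxnet import gluon

    net = gluon.nn.Dense(10)
    net.initialize()

    # Nadam accepts the common Optimizer arguments (learning_rate, wd, ...)
    # plus its own beta1, beta2, epsilon and schedule_decay.
    opt = mx.optimizer.Nadam(learning_rate=0.001, beta1=0.9, beta2=0.999)
    trainer = gluon.Trainer(net.collect_params(), opt)
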
diff --git a/api/python/model.html b/api/python/model.html
index 321daa1..bd9979b 100644
--- a/api/python/model.html
+++ b/api/python/model.html
@@ -2187,7 +2187,7 @@ by <a class="reference internal" href="optimization/optimization.html#mxnet.opti
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" href="http://cs229.stanford.edu/proj2015/054_report.pdf">http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ">https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those accepted
 by <a class="reference internal" href="optimization/optimization.html#mxnet.optimizer.Optimizer" title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/api/python/optimization.html b/api/python/optimization.html
index bb8cf9c..3c563e6 100644
--- a/api/python/optimization.html
+++ b/api/python/optimization.html
@@ -811,7 +811,7 @@ by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" href="http://cs229.stanford.edu/proj2015/054_report.pdf">http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ">https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those accepted
 by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/api/python/optimization/optimization.html b/api/python/optimization/optimization.html
index 03e4c76..4fb684e 100644
--- a/api/python/optimization/optimization.html
+++ b/api/python/optimization/optimization.html
@@ -978,7 +978,7 @@ by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" href="http://cs229.stanford.edu/proj2015/054_report.pdf">http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ">https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those accepted
 by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/faq/develop_and_hack.html b/faq/develop_and_hack.html
index b105ca9..1f06872 100644
--- a/faq/develop_and_hack.html
+++ b/faq/develop_and_hack.html
@@ -207,7 +207,6 @@
 <div class="toctree-wrapper compound">
 <ul>
 <li class="toctree-l1"><a class="reference internal" href="new_op.html">Create new operators</a></li>
-<li class="toctree-l1"><a class="reference internal" href="torch.html">Use Torch from MXNet</a></li>
 <li class="toctree-l1"><a class="reference internal" href="env_var.html">Set environment variables of MXNet</a></li>
 </ul>
 </div>
diff --git a/faq/index.html b/faq/index.html
index aedc70f..c74d9f6 100644
--- a/faq/index.html
+++ b/faq/index.html
@@ -229,7 +229,6 @@
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/add_op_in_backend.html">How do I implement operators in MXNet backend?</a></li>
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/new_op.html">How do I create new operators in MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/env_var.html">How do I set MXNet’s environmental variables?</a></li>
-<li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/torch.html">How do I use MXNet as a front end for Torch?</a></li>
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" href="#questions-about-using-mxnet">Questions about Using MXNet</a></li>
@@ -303,7 +302,6 @@ and full working examples, visit the <a class="reference internal" href="../tuto
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/add_op_in_backend.html">How do I implement operators in MXNet backend?</a></li>
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/new_op.html">How do I create new operators in MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/env_var.html">How do I set MXNet’s environmental variables?</a></li>
-<li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/torch.html">How do I use MXNet as a front end for Torch?</a></li>
 </ul>
 </div>
 </div>
diff --git a/faq/torch.html b/faq/torch.html
deleted file mode 100644
index f3553a0..0000000
--- a/faq/torch.html
+++ /dev/null
@@ -1,315 +0,0 @@
-<!DOCTYPE html>
-
-<html lang="en">
-<head>
-<meta charset="utf-8"/>
-<meta content="IE=edge" http-equiv="X-UA-Compatible"/>
-<meta content="width=device-width, initial-scale=1" name="viewport"/>
-<title>How to Use MXNet As an (Almost) Full-function Torch Front End — mxnet  documentation</title>
-<link crossorigin="anonymous" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/css/bootstrap.min.css" integrity="sha384-1q8mTJOASx8j1Au+a5WDVnPi2lkFfwwEAa8hDDdjZlpLegxhjVME1fgjWPGmkzs7" rel="stylesheet"/>
-<link href="https://maxcdn.bootstrapcdn.com/font-awesome/4.5.0/css/font-awesome.min.css" rel="stylesheet"/>
-<link href="../_static/basic.css" rel="stylesheet" type="text/css">
-<link href="../_static/pygments.css" rel="stylesheet" type="text/css">
-<link href="../_static/mxnet.css" rel="stylesheet" type="text/css"/>
-<script type="text/javascript">
-      var DOCUMENTATION_OPTIONS = {
-        URL_ROOT:    '../',
-        VERSION:     '',
-        COLLAPSE_INDEX: false,
-        FILE_SUFFIX: '.html',
-        HAS_SOURCE:  true,
-        SOURCELINK_SUFFIX: ''
-      };
-    </script>
-<script src="https://code.jquery.com/jquery-1.11.1.min.js" type="text/javascript"></script>
-<script src="../_static/underscore.js" type="text/javascript"></script>
-<script src="../_static/searchtools_custom.js" type="text/javascript"></script>
-<script src="../_static/doctools.js" type="text/javascript"></script>
-<script src="../_static/selectlang.js" type="text/javascript"></script>
-<script src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.1/MathJax.js?config=TeX-AMS-MML_HTMLorMML" type="text/javascript"></script>
-<script type="text/javascript"> jQuery(function() { Search.loadIndex("/searchindex.js"); Search.init();}); </script>
-<script>
-      (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
-      (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new
-      Date();a=s.createElement(o),
-      m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
-      })(window,document,'script','https://www.google-analytics.com/analytics.js','ga');
-
-      ga('create', 'UA-96378503-1', 'auto');
-      ga('send', 'pageview');
-
-    </script>
-<!-- -->
-<!-- <script type="text/javascript" src="../_static/jquery.js"></script> -->
-<!-- -->
-<!-- <script type="text/javascript" src="../_static/underscore.js"></script> -->
-<!-- -->
-<!-- <script type="text/javascript" src="../_static/doctools.js"></script> -->
-<!-- -->
-<!-- <script type="text/javascript" src="https://cdn.mathjax.org/mathjax/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML"></script> -->
-<!-- -->
-<link href="https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/image/mxnet-icon.png" rel="icon" type="image/png"/>
-</link></link></head>
-<body background="https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/image/mxnet-background-compressed.jpeg" role="document">
-<div class="content-block"><div class="navbar navbar-fixed-top">
-<div class="container" id="navContainer">
-<div class="innder" id="header-inner">
-<h1 id="logo-wrap">
-<a href="../" id="logo"><img src="https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/image/mxnet_logo.png"/></a>
-</h1>
-<nav class="nav-bar" id="main-nav">
-<a class="main-nav-link" href="../install/index.html">Install</a>
-<a class="main-nav-link" href="../tutorials/index.html">Tutorials</a>
-<span id="dropdown-menu-position-anchor">
-<a aria-expanded="true" aria-haspopup="true" class="main-nav-link dropdown-toggle" data-toggle="dropdown" href="#" role="button">Gluon <span class="caret"></span></a>
-<ul class="dropdown-menu navbar-menu" id="package-dropdown-menu">
-<li><a class="main-nav-link" href="../gluon/index.html">About</a></li>
-<li><a class="main-nav-link" href="http://gluon.mxnet.io">Tutorials</a></li>
-</ul>
-</span>
-<span id="dropdown-menu-position-anchor">
-<a aria-expanded="true" aria-haspopup="true" class="main-nav-link dropdown-toggle" data-toggle="dropdown" href="#" role="button">API <span class="caret"></span></a>
-<ul class="dropdown-menu navbar-menu" id="package-dropdown-menu">
-<li><a class="main-nav-link" href="../api/python/index.html">Python</a></li>
-<li><a class="main-nav-link" href="../api/scala/index.html">Scala</a></li>
-<li><a class="main-nav-link" href="../api/r/index.html">R</a></li>
-<li><a class="main-nav-link" href="../api/julia/index.html">Julia</a></li>
-<li><a class="main-nav-link" href="../api/c++/index.html">C++</a></li>
-<li><a class="main-nav-link" href="../api/perl/index.html">Perl</a></li>
-</ul>
-</span>
-<span id="dropdown-menu-position-anchor-docs">
-<a aria-expanded="true" aria-haspopup="true" class="main-nav-link dropdown-toggle" data-toggle="dropdown" href="#" role="button">Docs <span class="caret"></span></a>
-<ul class="dropdown-menu navbar-menu" id="package-dropdown-menu-docs">
-<li><a class="main-nav-link" href="../faq/index.html">FAQ</a></li>
-<li><a class="main-nav-link" href="../architecture/index.html">Architecture</a></li>
-<li><a class="main-nav-link" href="https://github.com/apache/incubator-mxnet/tree/1.0.0/example">Examples</a></li>
-<li><a class="main-nav-link" href="../model_zoo/index.html">Model Zoo</a></li>
-</ul>
-</span>
-<a class="main-nav-link" href="https://github.com/dmlc/mxnet">Github</a>
-<span id="dropdown-menu-position-anchor-community">
-<a aria-expanded="true" aria-haspopup="true" class="main-nav-link dropdown-toggle" data-toggle="dropdown" href="#" role="button">Community <span class="caret"></span></a>
-<ul class="dropdown-menu navbar-menu" id="package-dropdown-menu-community">
-<li><a class="main-nav-link" href="../community/index.html">Community</a></li>
-<li><a class="main-nav-link" href="../community/contribute.html">Contribute</a></li>
-<li><a class="main-nav-link" href="../community/powered_by.html">Powered By</a></li>
-</ul>
-</span>
-<a class="main-nav-link" href="http://discuss.mxnet.io">Discuss</a>
-<span id="dropdown-menu-position-anchor-version" style="position: relative"><a href="#" class="main-nav-link dropdown-toggle" data-toggle="dropdown" role="button" aria-haspopup="true" aria-expanded="true">Versions(1.0.0)<span class="caret"></span></a><ul id="package-dropdown-menu" class="dropdown-menu"><li><a class="main-nav-link" href=https://mxnet.incubator.apache.org/>1.0.0</a></li><li><a class="main-nav-link" href=https://mxnet.incubator.apache.org/versions/0.12.1/index.html>0.12.1</ [...]
-<script> function getRootPath(){ return "../" } </script>
-<div class="burgerIcon dropdown">
-<a class="dropdown-toggle" data-toggle="dropdown" href="#" role="button">☰</a>
-<ul class="dropdown-menu" id="burgerMenu">
-<li><a href="../install/index.html">Install</a></li>
-<li><a class="main-nav-link" href="../tutorials/index.html">Tutorials</a></li>
-<li class="dropdown-submenu">
-<a href="#" tabindex="-1">Community</a>
-<ul class="dropdown-menu">
-<li><a href="../community/index.html" tabindex="-1">Community</a></li>
-<li><a href="../community/contribute.html" tabindex="-1">Contribute</a></li>
-<li><a href="../community/powered_by.html" tabindex="-1">Powered By</a></li>
-</ul>
-</li>
-<li class="dropdown-submenu">
-<a href="#" tabindex="-1">API</a>
-<ul class="dropdown-menu">
-<li><a href="../api/python/index.html" tabindex="-1">Python</a>
-</li>
-<li><a href="../api/scala/index.html" tabindex="-1">Scala</a>
-</li>
-<li><a href="../api/r/index.html" tabindex="-1">R</a>
-</li>
-<li><a href="../api/julia/index.html" tabindex="-1">Julia</a>
-</li>
-<li><a href="../api/c++/index.html" tabindex="-1">C++</a>
-</li>
-<li><a href="../api/perl/index.html" tabindex="-1">Perl</a>
-</li>
-</ul>
-</li>
-<li class="dropdown-submenu">
-<a href="#" tabindex="-1">Docs</a>
-<ul class="dropdown-menu">
-<li><a href="../tutorials/index.html" tabindex="-1">Tutorials</a></li>
-<li><a href="../faq/index.html" tabindex="-1">FAQ</a></li>
-<li><a href="../architecture/index.html" tabindex="-1">Architecture</a></li>
-<li><a href="https://github.com/apache/incubator-mxnet/tree/1.0.0/example" tabindex="-1">Examples</a></li>
-<li><a href="../model_zoo/index.html" tabindex="-1">Model Zoo</a></li>
-</ul>
-</li>
-<li><a href="../architecture/index.html">Architecture</a></li>
-<li><a class="main-nav-link" href="https://github.com/dmlc/mxnet">Github</a></li>
-<li id="dropdown-menu-position-anchor-version-mobile" class="dropdown-submenu" style="position: relative"><a href="#" tabindex="-1">Versions(1.0.0)</a><ul class="dropdown-menu"><li><a tabindex="-1" href=https://mxnet.incubator.apache.org/>1.0.0</a></li><li><a tabindex="-1" href=https://mxnet.incubator.apache.org/versions/0.12.1/index.html>0.12.1</a></li><li><a tabindex="-1" href=https://mxnet.incubator.apache.org/versions/0.12.0/index.html>0.12.0</a></li><li><a tabindex="-1" href=https:/ [...]
-</div>
-<div class="plusIcon dropdown">
-<a class="dropdown-toggle" data-toggle="dropdown" href="#" role="button"><span aria-hidden="true" class="glyphicon glyphicon-plus"></span></a>
-<ul class="dropdown-menu dropdown-menu-right" id="plusMenu"></ul>
-</div>
-<div id="search-input-wrap">
-<form action="../search.html" autocomplete="off" class="" method="get" role="search">
-<div class="form-group inner-addon left-addon">
-<i class="glyphicon glyphicon-search"></i>
-<input class="form-control" name="q" placeholder="Search" type="text"/>
-</div>
-<input name="check_keywords" type="hidden" value="yes">
-<input name="area" type="hidden" value="default"/>
-</input></form>
-<div id="search-preview"></div>
-</div>
-<div id="searchIcon">
-<span aria-hidden="true" class="glyphicon glyphicon-search"></span>
-</div>
-<!-- <div id="lang-select-wrap"> -->
-<!--   <label id="lang-select-label"> -->
-<!--     <\!-- <i class="fa fa-globe"></i> -\-> -->
-<!--     <span></span> -->
-<!--   </label> -->
-<!--   <select id="lang-select"> -->
-<!--     <option value="en">Eng</option> -->
-<!--     <option value="zh">中文</option> -->
-<!--   </select> -->
-<!-- </div> -->
-<!--     <a id="mobile-nav-toggle">
-        <span class="mobile-nav-toggle-bar"></span>
-        <span class="mobile-nav-toggle-bar"></span>
-        <span class="mobile-nav-toggle-bar"></span>
-      </a> -->
-</div>
-</div>
-</div>
-<script type="text/javascript">
-        $('body').css('background', 'white');
-    </script>
-<div class="container">
-<div class="row">
-<div aria-label="main navigation" class="sphinxsidebar leftsidebar" role="navigation">
-<div class="sphinxsidebarwrapper">
-<ul>
-<li class="toctree-l1"><a class="reference internal" href="../api/python/index.html">Python Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../api/r/index.html">R Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../api/julia/index.html">Julia Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../api/c++/index.html">C++ Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../api/scala/index.html">Scala Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../api/perl/index.html">Perl Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" href="index.html">HowTo Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../architecture/index.html">System Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../tutorials/index.html">Tutorials</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../community/index.html">Community</a></li>
-</ul>
-</div>
-</div>
-<div class="content">
-<div class="page-tracker"></div>
-<div class="section" id="how-to-use-mxnet-as-an-almost-full-function-torch-front-end">
-<span id="how-to-use-mxnet-as-an-almost-full-function-torch-front-end"></span><h1>How to Use MXNet As an (Almost) Full-function Torch Front End<a class="headerlink" href="#how-to-use-mxnet-as-an-almost-full-function-torch-front-end" title="Permalink to this headline">¶</a></h1>
-<p>This topic demonstrates how to use MXNet as a front end to two of Torch’s major functionalities:</p>
-<ul class="simple">
-<li>Call Torch’s tensor mathematical functions with MXNet.NDArray</li>
-<li>Embed Torch’s neural network modules (layers) into MXNet’s symbolic graph</li>
-</ul>
-<div class="section" id="compile-with-torch">
-<span id="compile-with-torch"></span><h2>Compile with Torch<a class="headerlink" href="#compile-with-torch" title="Permalink to this headline">¶</a></h2>
-<ul class="simple">
-<li>Install Torch using the <a class="reference external" href="http://torch.ch/docs/getting-started.html">official guide</a>.<ul>
-<li>If you haven’t already done so, copy <code class="docutils literal"><span class="pre">make/config.mk</span></code> (Linux) or <code class="docutils literal"><span class="pre">make/osx.mk</span></code> (Mac) into the MXNet root folder as <code class="docutils literal"><span class="pre">config.mk</span></code>. In <code class="docutils literal"><span class="pre">config.mk</span></code> uncomment the lines <code class="docutils literal"><span class="pre">TORCH_PATH</span> <span class="p [...]
-<li>By default, Torch should be installed in your home folder (so <code class="docutils literal"><span class="pre">TORCH_PATH</span> <span class="pre">=</span> <span class="pre">$(HOME)/torch</span></code>). Modify TORCH_PATH to point to your torch installation, if necessary.</li>
-</ul>
-</li>
-<li>Run <code class="docutils literal"><span class="pre">make</span> <span class="pre">clean</span> <span class="pre">&amp;&amp;</span> <span class="pre">make</span></code> to build MXNet with Torch support.</li>
-</ul>
-</div>
-<div class="section" id="tensor-mathematics">
-<span id="tensor-mathematics"></span><h2>Tensor Mathematics<a class="headerlink" href="#tensor-mathematics" title="Permalink to this headline">¶</a></h2>
-<p>The mxnet.th module supports calling Torch’s tensor mathematical functions with mxnet.nd.NDArray. See <a class="reference external" href="https://github.com/dmlc/mxnet/blob/master/example/torch/torch_function.py">complete code</a>:</p>
-<div class="highlight-python"><div class="highlight"><pre><span></span>    <span class="kn">import</span> <span class="nn">mxnet</span> <span class="kn">as</span> <span class="nn">mx</span>
-    <span class="n">x</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">th</span><span class="o">.</span><span class="n">randn</span><span class="p">(</span><span class="mi">2</span><span class="p">,</span> <span class="mi">2</span><span class="p">,</span> <span class="n">ctx</span><span class="o">=</span><span class="n">mx</span><span class="o">.</span><span class="n">cpu</span><span class="p">(</span><span class="mi">0</span><span class="p [...]
-    <span class="k">print</span> <span class="n">x</span><span class="o">.</span><span class="n">asnumpy</span><span class="p">()</span>
-    <span class="n">y</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">th</span><span class="o">.</span><span class="n">abs</span><span class="p">(</span><span class="n">x</span><span class="p">)</span>
-    <span class="k">print</span> <span class="n">y</span><span class="o">.</span><span class="n">asnumpy</span><span class="p">()</span>
-
-    <span class="n">x</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">th</span><span class="o">.</span><span class="n">randn</span><span class="p">(</span><span class="mi">2</span><span class="p">,</span> <span class="mi">2</span><span class="p">,</span> <span class="n">ctx</span><span class="o">=</span><span class="n">mx</span><span class="o">.</span><span class="n">cpu</span><span class="p">(</span><span class="mi">0</span><span class="p [...]
-    <span class="k">print</span> <span class="n">x</span><span class="o">.</span><span class="n">asnumpy</span><span class="p">()</span>
-    <span class="n">mx</span><span class="o">.</span><span class="n">th</span><span class="o">.</span><span class="n">abs</span><span class="p">(</span><span class="n">x</span><span class="p">,</span> <span class="n">x</span><span class="p">)</span> <span class="c1"># in-place</span>
-    <span class="k">print</span> <span class="n">x</span><span class="o">.</span><span class="n">asnumpy</span><span class="p">()</span>
-</pre></div>
-</div>
-<p>For help, use the <code class="docutils literal"><span class="pre">help(mx.th)</span></code> command.</p>
-<p>We’ve added support for most common functions listed on <a class="reference external" href="https://github.com/torch/torch7/blob/master/doc/maths.md">Torch’s documentation page</a>.
-If you find that the function you need is not supported, you can easily register it in <code class="docutils literal"><span class="pre">mxnet_root/plugin/torch/torch_function.cc</span></code> by using the existing registrations as examples.</p>
-</div>
-<div class="section" id="torch-modules-layers">
-<span id="torch-modules-layers"></span><h2>Torch Modules (Layers)<a class="headerlink" href="#torch-modules-layers" title="Permalink to this headline">¶</a></h2>
-<p>MXNet supports Torch’s neural network modules through  the<code class="docutils literal"><span class="pre">mxnet.symbol.TorchModule</span></code> symbol.
-For example, the following code defines a three-layer DNN for classifying MNIST digits (<a class="reference external" href="https://github.com/dmlc/mxnet/blob/master/example/torch/torch_module.py">full code</a>):</p>
-<div class="highlight-python"><div class="highlight"><pre><span></span>    <span class="n">data</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">Variable</span><span class="p">(</span><span class="s1">'data'</span><span class="p">)</span>
-    <span class="n">fc1</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">TorchModule</span><span class="p">(</span><span class="n">data_0</span><span class="o">=</span><span class="n">data</span><span class="p">,</span> <span class="n">lua_string</span><span class="o">=</span><span class="s1">'nn.Linear(784, 128)'</span><span class="p">,</span> <span class="n">num_data</span><span class=" [...]
-    <span class="n">act1</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">TorchModule</span><span class="p">(</span><span class="n">data_0</span><span class="o">=</span><span class="n">fc1</span><span class="p">,</span> <span class="n">lua_string</span><span class="o">=</span><span class="s1">'nn.ReLU(false)'</span><span class="p">,</span> <span class="n">num_data</span><span class="o">=< [...]
-    <span class="n">fc2</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">TorchModule</span><span class="p">(</span><span class="n">data_0</span><span class="o">=</span><span class="n">act1</span><span class="p">,</span> <span class="n">lua_string</span><span class="o">=</span><span class="s1">'nn.Linear(128, 64)'</span><span class="p">,</span> <span class="n">num_data</span><span class="o [...]
-    <span class="n">act2</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">TorchModule</span><span class="p">(</span><span class="n">data_0</span><span class="o">=</span><span class="n">fc2</span><span class="p">,</span> <span class="n">lua_string</span><span class="o">=</span><span class="s1">'nn.ReLU(false)'</span><span class="p">,</span> <span class="n">num_data</span><span class="o">=< [...]
-    <span class="n">fc3</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">TorchModule</span><span class="p">(</span><span class="n">data_0</span><span class="o">=</span><span class="n">act2</span><span class="p">,</span> <span class="n">lua_string</span><span class="o">=</span><span class="s1">'nn.Linear(64, 10)'</span><span class="p">,</span> <span class="n">num_data</span><span class="o" [...]
-    <span class="n">mlp</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">SoftmaxOutput</span><span class="p">(</span><span class="n">data</span><span class="o">=</span><span class="n">fc3</span><span class="p">,</span> <span class="n">name</span><span class="o">=</span><span class="s1">'softmax'</span><span class="p">)</span>
-</pre></div>
-</div>
-<p>Let’s break it down. First <code class="docutils literal"><span class="pre">data</span> <span class="pre">=</span> <span class="pre">mx.symbol.Variable('data')</span></code> defines a Variable as a placeholder for input.
-Then, it’s fed through Torch’s nn modules with:
-<code class="docutils literal"><span class="pre">fc1</span> <span class="pre">=</span> <span class="pre">mx.symbol.TorchModule(data_0=data,</span> <span class="pre">lua_string='nn.Linear(784,</span> <span class="pre">128)',</span> <span class="pre">num_data=1,</span> <span class="pre">num_params=2,</span> <span class="pre">num_outputs=1,</span> <span class="pre">name='fc1')</span></code>.
-To use Torch’s criterion as loss functions, you can replace the last line with:</p>
-<div class="highlight-python"><div class="highlight"><pre><span></span>    <span class="n">logsoftmax</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">TorchModule</span><span class="p">(</span><span class="n">data_0</span><span class="o">=</span><span class="n">fc3</span><span class="p">,</span> <span class="n">lua_string</span><span class="o">=</span><span class="s1">'nn.LogSoftMax()'</s [...]
-    <span class="c1"># Torch's label starts from 1</span>
-    <span class="n">label</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">Variable</span><span class="p">(</span><span class="s1">'softmax_label'</span><span class="p">)</span> <span class="o">+</span> <span class="mi">1</span>
-    <span class="n">mlp</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">TorchCriterion</span><span class="p">(</span><span class="n">data</span><span class="o">=</span><span class="n">logsoftmax</span><span class="p">,</span> <span class="n">label</span><span class="o">=</span><span class="n">label</span><span class="p">,</span> <span class="n">lua_string</span><span class="o">=</span><s [...]
-</pre></div>
-</div>
-<p>The input to the nn module is named data_i for i = 0 ... num_data-1. <code class="docutils literal"><span class="pre">lua_string</span></code> is a single Lua statement that creates the module object.
-For Torch’s built-in module, this is simply <code class="docutils literal"><span class="pre">nn.module_name(arguments)</span></code>.
-If you are using a custom module, place it in a .lua script file and load it with <code class="docutils literal"><span class="pre">require</span> <span class="pre">'module_file.lua'</span></code> if your script returns a torch.nn object, or <code class="docutils literal"><span class="pre">(require</span> <span class="pre">'module_file.lua')()</span></code> if your script returns a torch.nn class.</p>
-</div>
-</div>
-</div>
-</div>
-<div aria-label="main navigation" class="sphinxsidebar rightsidebar" role="navigation">
-<div class="sphinxsidebarwrapper">
-<h3><a href="../index.html">Table Of Contents</a></h3>
-<ul>
-<li><a class="reference internal" href="#">How to Use MXNet As an (Almost) Full-function Torch Front End</a><ul>
-<li><a class="reference internal" href="#compile-with-torch">Compile with Torch</a></li>
-<li><a class="reference internal" href="#tensor-mathematics">Tensor Mathematics</a></li>
-<li><a class="reference internal" href="#torch-modules-layers">Torch Modules (Layers)</a></li>
-</ul>
-</li>
-</ul>
-</div>
-</div>
-</div><div class="footer">
-<div class="section-disclaimer">
-<div class="container">
-<div>
-<img height="60" src="https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/image/apache_incubator_logo.png"/>
-<p>
-            Apache MXNet is an effort undergoing incubation at The Apache Software Foundation (ASF), <strong>sponsored by the <i>Apache Incubator</i></strong>. Incubation is required of all newly accepted projects until a further review indicates that the infrastructure, communications, and decision making process have stabilized in a manner consistent with other successful ASF projects. While incubation status is not necessarily a reflection of the completeness or stability of the code, [...]
-        </p>
-<p>
-            "Copyright © 2017, The Apache Software Foundation
-            Apache MXNet, MXNet, Apache, the Apache feather, and the Apache MXNet project logo are either registered trademarks or trademarks of the Apache Software Foundation."
-        </p>
-</div>
-</div>
-</div>
-</div> <!-- pagename != index -->
-</div>
-<script crossorigin="anonymous" integrity="sha384-0mSbJDEHialfmuBBQP6A4Qrprq5OVfW37PRR3j5ELqxss1yVqOtnepnHVP9aJ7xS" src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/js/bootstrap.min.js"></script>
-<script src="../_static/js/sidebar.js" type="text/javascript"></script>
-<script src="../_static/js/search.js" type="text/javascript"></script>
-<script src="../_static/js/navbar.js" type="text/javascript"></script>
-<script src="../_static/js/clipboard.min.js" type="text/javascript"></script>
-<script src="../_static/js/copycode.js" type="text/javascript"></script>
-<script src="../_static/js/page.js" type="text/javascript"></script>
-<script type="text/javascript">
-        $('body').ready(function () {
-            $('body').css('visibility', 'visible');
-        });
-    </script>
-</body>
-</html>
\ No newline at end of file
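The deleted faq/torch.html page above documented embedding Torch layers through mx.symbol.TorchModule and loading custom Lua modules via lua_string. A rough sketch of that custom-module pattern, assuming an MXNet build with the Torch plugin enabled and a hypothetical my_layer.lua script (the file name and num_params value are assumptions, not from the page):

    import mxnet as mx

    data = mx.symbol.Variable('data')
    # Per the deleted page: use require 'my_layer.lua' when the script returns
    # an nn object, or (require 'my_layer.lua')() when it returns an nn class.
    # num_params=0 assumes the custom layer has no learnable parameters.
    custom = mx.symbol.TorchModule(data_0=data,
                                   lua_string="(require 'my_layer.lua')()",
                                   num_data=1, num_params=0, num_outputs=1,
                                   name='custom1')
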
diff --git a/how_to/develop_and_hack.html b/how_to/develop_and_hack.html
index ddc5aa9..bbb4c1d 100644
--- a/how_to/develop_and_hack.html
+++ b/how_to/develop_and_hack.html
@@ -169,7 +169,6 @@
 <div class="toctree-wrapper compound">
 <ul>
 <li class="toctree-l1"><a class="reference internal" href="new_op.html">Create new operators</a></li>
-<li class="toctree-l1"><a class="reference internal" href="torch.html">Use Torch from MXNet</a></li>
 <li class="toctree-l1"><a class="reference internal" href="env_var.html">Set environment variables of MXNet</a></li>
 </ul>
 </div>
diff --git a/how_to/index.html b/how_to/index.html
index 380d51c..f88b0b6 100644
--- a/how_to/index.html
+++ b/how_to/index.html
@@ -188,7 +188,6 @@
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/community/contribute.html">How do I contribute a patch to MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/new_op.html">How do I create new operators in MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/env_var.html">How do I set MXNet’s environmental variables?</a></li>
-<li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/torch.html">How do I use MXNet as a front end for Torch?</a></li>
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" href="#questions-about-using-mxnet">Questions about Using MXNet</a></li>
@@ -251,7 +250,6 @@ and full working examples, visit the <a class="reference internal" href="../tuto
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/community/contribute.html">How do I contribute a patch to MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/new_op.html">How do I create new operators in MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/env_var.html">How do I set MXNet’s environmental variables?</a></li>
-<li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/torch.html">How do I use MXNet as a front end for Torch?</a></li>
 </ul>
 </div>
 </div>
diff --git a/how_to/torch.html b/how_to/torch.html
deleted file mode 100644
index f69a78f..0000000
--- a/how_to/torch.html
+++ /dev/null
@@ -1,264 +0,0 @@
-<!DOCTYPE html>
-
-<html lang="en">
-<head>
-<meta charset="utf-8"/>
-<meta content="IE=edge" http-equiv="X-UA-Compatible"/>
-<meta content="width=device-width, initial-scale=1" name="viewport"/>
-<title>How to Use MXNet As an (Almost) Full-function Torch Front End — mxnet  documentation</title>
-<link crossorigin="anonymous" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/css/bootstrap.min.css" integrity="sha384-1q8mTJOASx8j1Au+a5WDVnPi2lkFfwwEAa8hDDdjZlpLegxhjVME1fgjWPGmkzs7" rel="stylesheet"/>
-<link href="https://maxcdn.bootstrapcdn.com/font-awesome/4.5.0/css/font-awesome.min.css" rel="stylesheet"/>
-<link href="../_static/basic.css" rel="stylesheet" type="text/css">
-<link href="../_static/pygments.css" rel="stylesheet" type="text/css">
-<link href="../_static/mxnet.css" rel="stylesheet" type="text/css"/>
-<script type="text/javascript">
-      var DOCUMENTATION_OPTIONS = {
-        URL_ROOT:    '../',
-        VERSION:     '',
-        COLLAPSE_INDEX: false,
-        FILE_SUFFIX: '.html',
-        HAS_SOURCE:  true,
-        SOURCELINK_SUFFIX: ''
-      };
-    </script>
-<script src="../_static/jquery-1.11.1.js" type="text/javascript"></script>
-<script src="../_static/underscore.js" type="text/javascript"></script>
-<script src="../_static/searchtools_custom.js" type="text/javascript"></script>
-<script src="../_static/doctools.js" type="text/javascript"></script>
-<script src="../_static/selectlang.js" type="text/javascript"></script>
-<script src="https://cdn.mathjax.org/mathjax/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML" type="text/javascript"></script>
-<script type="text/javascript"> jQuery(function() { Search.loadIndex("/searchindex.js"); Search.init();}); </script>
-<script>
-      (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
-      (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new
-      Date();a=s.createElement(o),
-      m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
-      })(window,document,'script','https://www.google-analytics.com/analytics.js','ga');
-
-      ga('create', 'UA-96378503-1', 'auto');
-      ga('send', 'pageview');
-
-    </script>
-<!-- -->
-<!-- <script type="text/javascript" src="../_static/jquery.js"></script> -->
-<!-- -->
-<!-- <script type="text/javascript" src="../_static/underscore.js"></script> -->
-<!-- -->
-<!-- <script type="text/javascript" src="../_static/doctools.js"></script> -->
-<!-- -->
-<!-- <script type="text/javascript" src="https://cdn.mathjax.org/mathjax/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML"></script> -->
-<!-- -->
-<link href="https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/image/mxnet-icon.png" rel="icon" type="image/png"/>
-</link></link></head>
-<body role="document"><div class="navbar navbar-fixed-top">
-<div class="container" id="navContainer">
-<div class="innder" id="header-inner">
-<h1 id="logo-wrap">
-<a href="../" id="logo"><img src="../_static/mxnet.png"/></a>
-</h1>
-<nav class="nav-bar" id="main-nav">
-<a class="main-nav-link" href="../get_started/install.html">Install</a>
-<a class="main-nav-link" href="../tutorials/index.html">Tutorials</a>
-<span id="dropdown-menu-position-anchor">
-<a aria-expanded="true" aria-haspopup="true" class="main-nav-link dropdown-toggle" data-toggle="dropdown" href="#" role="button">Gluon <span class="caret"></span></a>
-<ul class="dropdown-menu" id="package-dropdown-menu">
-<li><a class="main-nav-link" href="../gluon/index.html">About</a></li>
-<li><a class="main-nav-link" href="http://gluon.mxnet.io/">Tutorials</a></li>
-</ul>
-</span>
-<a class="main-nav-link" href="../how_to/index.html">How To</a>
-<span id="dropdown-menu-position-anchor">
-<a aria-expanded="true" aria-haspopup="true" class="main-nav-link dropdown-toggle" data-toggle="dropdown" href="#" role="button">API <span class="caret"></span></a>
-<ul class="dropdown-menu" id="package-dropdown-menu">
-<li><a class="main-nav-link" href="../api/python/index.html">Python</a></li>
-<li><a class="main-nav-link" href="../api/scala/index.html">Scala</a></li>
-<li><a class="main-nav-link" href="../api/r/index.html">R</a></li>
-<li><a class="main-nav-link" href="../api/julia/index.html">Julia</a></li>
-<li><a class="main-nav-link" href="../api/c++/index.html">C++</a></li>
-<li><a class="main-nav-link" href="../api/perl/index.html">Perl</a></li>
-</ul>
-</span>
-<a class="main-nav-link" href="../architecture/index.html">Architecture</a>
-<!-- <a class="main-nav-link" href="../community/index.html">Community</a> -->
-<a class="main-nav-link" href="https://github.com/dmlc/mxnet">Github</a>
-<span id="dropdown-menu-position-anchor-version" style="position: relative"><a href="#" class="main-nav-link dropdown-toggle" data-toggle="dropdown" role="button" aria-haspopup="true" aria-expanded="true">Versions(0.11.0)<span class="caret"></span></a><ul id="package-dropdown-menu" class="dropdown-menu"><li><a class="main-nav-link" href=https://mxnet.incubator.apache.org/>0.11.0</a></li><li><a class="main-nav-link" href=https://mxnet.incubator.apache.org/versions/master/index.html>master [...]
-<script> function getRootPath(){ return "../" } </script>
-<div class="burgerIcon dropdown">
-<a class="dropdown-toggle" data-toggle="dropdown" href="#" role="button">☰</a>
-<ul class="dropdown-menu dropdown-menu-right" id="burgerMenu">
-<li><a href="../get_started/install.html">Install</a></li>
-<li><a href="../tutorials/index.html">Tutorials</a></li>
-<li><a href="../how_to/index.html">How To</a></li>
-<li class="dropdown-submenu">
-<a href="#" tabindex="-1">API</a>
-<ul class="dropdown-menu">
-<li><a href="../api/python/index.html" tabindex="-1">Python</a>
-</li>
-<li><a href="../api/scala/index.html" tabindex="-1">Scala</a>
-</li>
-<li><a href="../api/r/index.html" tabindex="-1">R</a>
-</li>
-<li><a href="../api/julia/index.html" tabindex="-1">Julia</a>
-</li>
-<li><a href="../api/c++/index.html" tabindex="-1">C++</a>
-</li>
-<li><a href="../api/perl/index.html" tabindex="-1">Perl</a>
-</li>
-</ul>
-</li>
-<li><a href="../architecture/index.html">Architecture</a></li>
-<li><a class="main-nav-link" href="https://github.com/dmlc/mxnet">Github</a></li>
-<li id="dropdown-menu-position-anchor-version-mobile" class="dropdown-submenu" style="position: relative"><a href="#" tabindex="-1">Versions(0.11.0)</a><ul class="dropdown-menu"><li><a tabindex="-1" href=https://mxnet.incubator.apache.org/>0.11.0</a></li><li><a tabindex="-1" href=https://mxnet.incubator.apache.org/versions/master/index.html>master</a></li></ul></li></ul>
-</div>
-<div class="plusIcon dropdown">
-<a class="dropdown-toggle" data-toggle="dropdown" href="#" role="button"><span aria-hidden="true" class="glyphicon glyphicon-plus"></span></a>
-<ul class="dropdown-menu dropdown-menu-right" id="plusMenu"></ul>
-</div>
-<div id="search-input-wrap">
-<form action="../search.html" autocomplete="off" class="" method="get" role="search">
-<div class="form-group inner-addon left-addon">
-<i class="glyphicon glyphicon-search"></i>
-<input class="form-control" name="q" placeholder="Search" type="text"/>
-</div>
-<input name="check_keywords" type="hidden" value="yes">
-<input name="area" type="hidden" value="default"/>
-</input></form>
-<div id="search-preview"></div>
-</div>
-<div id="searchIcon">
-<span aria-hidden="true" class="glyphicon glyphicon-search"></span>
-</div>
-<!-- <div id="lang-select-wrap"> -->
-<!--   <label id="lang-select-label"> -->
-<!--     <\!-- <i class="fa fa-globe"></i> -\-> -->
-<!--     <span></span> -->
-<!--   </label> -->
-<!--   <select id="lang-select"> -->
-<!--     <option value="en">Eng</option> -->
-<!--     <option value="zh">中文</option> -->
-<!--   </select> -->
-<!-- </div> -->
-<!--     <a id="mobile-nav-toggle">
-        <span class="mobile-nav-toggle-bar"></span>
-        <span class="mobile-nav-toggle-bar"></span>
-        <span class="mobile-nav-toggle-bar"></span>
-      </a> -->
-</div>
-</div>
-</div>
-<div class="container">
-<div class="row">
-<div aria-label="main navigation" class="sphinxsidebar leftsidebar" role="navigation">
-<div class="sphinxsidebarwrapper">
-<ul>
-<li class="toctree-l1"><a class="reference internal" href="../api/python/index.html">Python Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../api/r/index.html">R Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../api/julia/index.html">Julia Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../api/c++/index.html">C++ Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../api/scala/index.html">Scala Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../api/perl/index.html">Perl Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" href="index.html">HowTo Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../architecture/index.html">System Documents</a></li>
-<li class="toctree-l1"><a class="reference internal" href="../tutorials/index.html">Tutorials</a></li>
-</ul>
-</div>
-</div>
-<div class="content">
-<div class="section" id="how-to-use-mxnet-as-an-almost-full-function-torch-front-end">
-<span id="how-to-use-mxnet-as-an-almost-full-function-torch-front-end"></span><h1>How to Use MXNet As an (Almost) Full-function Torch Front End<a class="headerlink" href="#how-to-use-mxnet-as-an-almost-full-function-torch-front-end" title="Permalink to this headline">¶</a></h1>
-<p>This topic demonstrates how to use MXNet as a front end to two of Torch’s major functionalities:</p>
-<ul class="simple">
-<li>Call Torch’s tensor mathematical functions with MXNet.NDArray</li>
-<li>Embed Torch’s neural network modules (layers) into MXNet’s symbolic graph</li>
-</ul>
-<div class="section" id="compile-with-torch">
-<span id="compile-with-torch"></span><h2>Compile with Torch<a class="headerlink" href="#compile-with-torch" title="Permalink to this headline">¶</a></h2>
-<ul class="simple">
-<li>Install Torch using the <a class="reference external" href="http://torch.ch/docs/getting-started.html">official guide</a>.<ul>
-<li>If you haven’t already done so, copy <code class="docutils literal"><span class="pre">make/config.mk</span></code> (Linux) or <code class="docutils literal"><span class="pre">make/osx.mk</span></code> (Mac) into the MXNet root folder as <code class="docutils literal"><span class="pre">config.mk</span></code>. In <code class="docutils literal"><span class="pre">config.mk</span></code> uncomment the lines <code class="docutils literal"><span class="pre">TORCH_PATH</span> <span class="p [...]
-<li>By default, Torch should be installed in your home folder (so <code class="docutils literal"><span class="pre">TORCH_PATH</span> <span class="pre">=</span> <span class="pre">$(HOME)/torch</span></code>). Modify TORCH_PATH to point to your torch installation, if necessary.</li>
-</ul>
-</li>
-<li>Run <code class="docutils literal"><span class="pre">make</span> <span class="pre">clean</span> <span class="pre">&amp;&amp;</span> <span class="pre">make</span></code> to build MXNet with Torch support.</li>
-</ul>
-</div>
-<div class="section" id="tensor-mathematics">
-<span id="tensor-mathematics"></span><h2>Tensor Mathematics<a class="headerlink" href="#tensor-mathematics" title="Permalink to this headline">¶</a></h2>
-<p>The mxnet.th module supports calling Torch’s tensor mathematical functions with mxnet.nd.NDArray. See <a class="reference external" href="https://github.com/dmlc/mxnet/blob/master/example/torch/torch_function.py">complete code</a>:</p>
-<div class="highlight-python"><div class="highlight"><pre><span></span>    <span class="kn">import</span> <span class="nn">mxnet</span> <span class="kn">as</span> <span class="nn">mx</span>
-    <span class="n">x</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">th</span><span class="o">.</span><span class="n">randn</span><span class="p">(</span><span class="mi">2</span><span class="p">,</span> <span class="mi">2</span><span class="p">,</span> <span class="n">ctx</span><span class="o">=</span><span class="n">mx</span><span class="o">.</span><span class="n">cpu</span><span class="p">(</span><span class="mi">0</span><span class="p [...]
-    <span class="k">print</span> <span class="n">x</span><span class="o">.</span><span class="n">asnumpy</span><span class="p">()</span>
-    <span class="n">y</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">th</span><span class="o">.</span><span class="n">abs</span><span class="p">(</span><span class="n">x</span><span class="p">)</span>
-    <span class="k">print</span> <span class="n">y</span><span class="o">.</span><span class="n">asnumpy</span><span class="p">()</span>
-
-    <span class="n">x</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">th</span><span class="o">.</span><span class="n">randn</span><span class="p">(</span><span class="mi">2</span><span class="p">,</span> <span class="mi">2</span><span class="p">,</span> <span class="n">ctx</span><span class="o">=</span><span class="n">mx</span><span class="o">.</span><span class="n">cpu</span><span class="p">(</span><span class="mi">0</span><span class="p [...]
-    <span class="k">print</span> <span class="n">x</span><span class="o">.</span><span class="n">asnumpy</span><span class="p">()</span>
-    <span class="n">mx</span><span class="o">.</span><span class="n">th</span><span class="o">.</span><span class="n">abs</span><span class="p">(</span><span class="n">x</span><span class="p">,</span> <span class="n">x</span><span class="p">)</span> <span class="c1"># in-place</span>
-    <span class="k">print</span> <span class="n">x</span><span class="o">.</span><span class="n">asnumpy</span><span class="p">()</span>
-</pre></div>
-</div>
-<p>For help, use the <code class="docutils literal"><span class="pre">help(mx.th)</span></code> command.</p>
-<p>We’ve added support for most common functions listed on <a class="reference external" href="https://github.com/torch/torch7/blob/master/doc/maths.md">Torch’s documentation page</a>.
-If you find that the function you need is not supported, you can easily register it in <code class="docutils literal"><span class="pre">mxnet_root/plugin/torch/torch_function.cc</span></code> by using the existing registrations as examples.</p>
-</div>
-<div class="section" id="torch-modules-layers">
-<span id="torch-modules-layers"></span><h2>Torch Modules (Layers)<a class="headerlink" href="#torch-modules-layers" title="Permalink to this headline">¶</a></h2>
-<p>MXNet supports Torch’s neural network modules through  the<code class="docutils literal"><span class="pre">mxnet.symbol.TorchModule</span></code> symbol.
-For example, the following code defines a three-layer DNN for classifying MNIST digits (<a class="reference external" href="https://github.com/dmlc/mxnet/blob/master/example/torch/torch_module.py">full code</a>):</p>
-<div class="highlight-python"><div class="highlight"><pre><span></span>    <span class="n">data</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">Variable</span><span class="p">(</span><span class="s1">'data'</span><span class="p">)</span>
-    <span class="n">fc1</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">TorchModule</span><span class="p">(</span><span class="n">data_0</span><span class="o">=</span><span class="n">data</span><span class="p">,</span> <span class="n">lua_string</span><span class="o">=</span><span class="s1">'nn.Linear(784, 128)'</span><span class="p">,</span> <span class="n">num_data</span><span class=" [...]
-    <span class="n">act1</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">TorchModule</span><span class="p">(</span><span class="n">data_0</span><span class="o">=</span><span class="n">fc1</span><span class="p">,</span> <span class="n">lua_string</span><span class="o">=</span><span class="s1">'nn.ReLU(false)'</span><span class="p">,</span> <span class="n">num_data</span><span class="o">=< [...]
-    <span class="n">fc2</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">TorchModule</span><span class="p">(</span><span class="n">data_0</span><span class="o">=</span><span class="n">act1</span><span class="p">,</span> <span class="n">lua_string</span><span class="o">=</span><span class="s1">'nn.Linear(128, 64)'</span><span class="p">,</span> <span class="n">num_data</span><span class="o [...]
-    <span class="n">act2</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">TorchModule</span><span class="p">(</span><span class="n">data_0</span><span class="o">=</span><span class="n">fc2</span><span class="p">,</span> <span class="n">lua_string</span><span class="o">=</span><span class="s1">'nn.ReLU(false)'</span><span class="p">,</span> <span class="n">num_data</span><span class="o">=< [...]
-    <span class="n">fc3</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">TorchModule</span><span class="p">(</span><span class="n">data_0</span><span class="o">=</span><span class="n">act2</span><span class="p">,</span> <span class="n">lua_string</span><span class="o">=</span><span class="s1">'nn.Linear(64, 10)'</span><span class="p">,</span> <span class="n">num_data</span><span class="o" [...]
-    <span class="n">mlp</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">SoftmaxOutput</span><span class="p">(</span><span class="n">data</span><span class="o">=</span><span class="n">fc3</span><span class="p">,</span> <span class="n">name</span><span class="o">=</span><span class="s1">'softmax'</span><span class="p">)</span>
-</pre></div>
-</div>
-<p>Let’s break it down. First <code class="docutils literal"><span class="pre">data</span> <span class="pre">=</span> <span class="pre">mx.symbol.Variable('data')</span></code> defines a Variable as a placeholder for input.
-Then, it’s fed through Torch’s nn modules with:
-<code class="docutils literal"><span class="pre">fc1</span> <span class="pre">=</span> <span class="pre">mx.symbol.TorchModule(data_0=data,</span> <span class="pre">lua_string='nn.Linear(784,</span> <span class="pre">128)',</span> <span class="pre">num_data=1,</span> <span class="pre">num_params=2,</span> <span class="pre">num_outputs=1,</span> <span class="pre">name='fc1')</span></code>.
-To use Torch’s criterion as loss functions, you can replace the last line with:</p>
-<div class="highlight-python"><div class="highlight"><pre><span></span>    <span class="n">logsoftmax</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">TorchModule</span><span class="p">(</span><span class="n">data_0</span><span class="o">=</span><span class="n">fc3</span><span class="p">,</span> <span class="n">lua_string</span><span class="o">=</span><span class="s1">'nn.LogSoftMax()'</s [...]
-    <span class="c1"># Torch's label starts from 1</span>
-    <span class="n">label</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">Variable</span><span class="p">(</span><span class="s1">'softmax_label'</span><span class="p">)</span> <span class="o">+</span> <span class="mi">1</span>
-    <span class="n">mlp</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">symbol</span><span class="o">.</span><span class="n">TorchCriterion</span><span class="p">(</span><span class="n">data</span><span class="o">=</span><span class="n">logsoftmax</span><span class="p">,</span> <span class="n">label</span><span class="o">=</span><span class="n">label</span><span class="p">,</span> <span class="n">lua_string</span><span class="o">=</span><s [...]
-</pre></div>
-</div>
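A minimal sketch of how a symbol like the `mlp` defined above was typically trained with the MXNet Module API (not part of the removed torch.html page; it assumes MXNet was built with the Torch plugin as that page describes, that `train_iter`/`val_iter` are MNIST data iterators you have created, and the hyperparameters are illustrative only):

    # Hedged sketch: `mlp` is the symbol built in the snippet above;
    # `train_iter` and `val_iter` are assumed MNIST iterators (e.g. mx.io.MNISTIter).
    import mxnet as mx

    mod = mx.mod.Module(symbol=mlp,
                        data_names=['data'],
                        label_names=['softmax_label'],
                        context=mx.cpu())
    mod.fit(train_iter,
            eval_data=val_iter,
            optimizer='sgd',
            optimizer_params={'learning_rate': 0.1},
            eval_metric='acc',
            num_epoch=10)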
-<p>The input to the nn module is named data_i for i = 0 ... num_data-1. <code class="docutils literal"><span class="pre">lua_string</span></code> is a single Lua statement that creates the module object.
-For Torch’s built-in module, this is simply <code class="docutils literal"><span class="pre">nn.module_name(arguments)</span></code>.
-If you are using a custom module, place it in a .lua script file and load it with <code class="docutils literal"><span class="pre">require</span> <span class="pre">'module_file.lua'</span></code> if your script returns a torch.nn object, or <code class="docutils literal"><span class="pre">(require</span> <span class="pre">'module_file.lua')()</span></code> if your script returns a torch.nn class.</p>
-</div>
-</div>
-<div class="container">
-<div class="footer">
-<p> </p>
-</div>
-</div>
-</div>
-<div aria-label="main navigation" class="sphinxsidebar rightsidebar" role="navigation">
-<div class="sphinxsidebarwrapper">
-<h3><a href="../index.html">Table Of Contents</a></h3>
-<ul>
-<li><a class="reference internal" href="#">How to Use MXNet As an (Almost) Full-function Torch Front End</a><ul>
-<li><a class="reference internal" href="#compile-with-torch">Compile with Torch</a></li>
-<li><a class="reference internal" href="#tensor-mathematics">Tensor Mathematics</a></li>
-<li><a class="reference internal" href="#torch-modules-layers">Torch Modules (Layers)</a></li>
-</ul>
-</li>
-</ul>
-</div>
-</div>
-</div> <!-- pagename != index -->
-<script crossorigin="anonymous" integrity="sha384-0mSbJDEHialfmuBBQP6A4Qrprq5OVfW37PRR3j5ELqxss1yVqOtnepnHVP9aJ7xS" src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.6/js/bootstrap.min.js"></script>
-<script src="../_static/js/sidebar.js" type="text/javascript"></script>
-<script src="../_static/js/search.js" type="text/javascript"></script>
-<script src="../_static/js/navbar.js" type="text/javascript"></script>
-<script src="../_static/js/clipboard.min.js" type="text/javascript"></script>
-<script src="../_static/js/copycode.js" type="text/javascript"></script>
-<script type="text/javascript">
-        $('body').ready(function () {
-            $('body').css('visibility', 'visible');
-        });
-    </script>
-</div></body>
-</html>
diff --git a/versions/0.11.0/api/python/model.html b/versions/0.11.0/api/python/model.html
index fc1d124..ba1dc53 100644
--- a/versions/0.11.0/api/python/model.html
+++ b/versions/0.11.0/api/python/model.html
@@ -1966,7 +1966,7 @@ by <a class="reference internal" href="optimization.html#mxnet.optimizer.Optimiz
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" href="http://cs229.stanford.edu/proj2015/054_report.pdf">http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ">https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those accepted
 by <a class="reference internal" href="optimization.html#mxnet.optimizer.Optimizer" title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/0.11.0/api/python/optimization.html b/versions/0.11.0/api/python/optimization.html
index 52d6e1d..1abc12e 100644
--- a/versions/0.11.0/api/python/optimization.html
+++ b/versions/0.11.0/api/python/optimization.html
@@ -811,7 +811,7 @@ by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" href="http://cs229.stanford.edu/proj2015/054_report.pdf">http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ">https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those accepted
 by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/0.11.0/how_to/develop_and_hack.html b/versions/0.11.0/how_to/develop_and_hack.html
index 7251e80..2872dcb 100644
--- a/versions/0.11.0/how_to/develop_and_hack.html
+++ b/versions/0.11.0/how_to/develop_and_hack.html
@@ -169,7 +169,6 @@
 <div class="toctree-wrapper compound">
 <ul>
 <li class="toctree-l1"><a class="reference internal" href="new_op.html">Create new operators</a></li>
-<li class="toctree-l1"><a class="reference internal" href="torch.html">Use Torch from MXNet</a></li>
 <li class="toctree-l1"><a class="reference internal" href="env_var.html">Set environment variables of MXNet</a></li>
 </ul>
 </div>
diff --git a/versions/0.11.0/how_to/index.html b/versions/0.11.0/how_to/index.html
index 7b39bce..237cafc 100644
--- a/versions/0.11.0/how_to/index.html
+++ b/versions/0.11.0/how_to/index.html
@@ -188,7 +188,6 @@
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/community/contribute.html">How do I contribute a patch to MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/new_op.html">How do I create new operators in MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/env_var.html">How do I set MXNet’s environmental variables?</a></li>
-<li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/torch.html">How do I use MXNet as a front end for Torch?</a></li>
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" href="#questions-about-using-mxnet">Questions about Using MXNet</a></li>
@@ -251,7 +250,6 @@ and full working examples, visit the <a class="reference internal" href="../tuto
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/community/contribute.html">How do I contribute a patch to MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/new_op.html">How do I create new operators in MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/env_var.html">How do I set MXNet’s environmental variables?</a></li>
-<li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/torch.html">How do I use MXNet as a front end for Torch?</a></li>
 </ul>
 </div>
 </div>
diff --git a/versions/0.12.0/api/python/model.html b/versions/0.12.0/api/python/model.html
index 1fbcaa5..c998ee6 100644
--- a/versions/0.12.0/api/python/model.html
+++ b/versions/0.12.0/api/python/model.html
@@ -2170,7 +2170,7 @@ by <a class="reference internal" href="optimization/optimization.html#mxnet.opti
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" href="http://cs229.stanford.edu/proj2015/054_report.pdf">http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ">https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those accepted
 by <a class="reference internal" href="optimization/optimization.html#mxnet.optimizer.Optimizer" title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/0.12.0/api/python/optimization.html b/versions/0.12.0/api/python/optimization.html
index 800d05f..b057f0c 100644
--- a/versions/0.12.0/api/python/optimization.html
+++ b/versions/0.12.0/api/python/optimization.html
@@ -811,7 +811,7 @@ by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" href="http://cs229.stanford.edu/proj2015/054_report.pdf">http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ">https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those accepted
 by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/0.12.0/api/python/optimization/optimization.html b/versions/0.12.0/api/python/optimization/optimization.html
index 75e4df5..3d761c1 100644
--- a/versions/0.12.0/api/python/optimization/optimization.html
+++ b/versions/0.12.0/api/python/optimization/optimization.html
@@ -961,7 +961,7 @@ by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" href="http://cs229.stanford.edu/proj2015/054_report.pdf">http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ">https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those accepted
 by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/0.12.0/faq/develop_and_hack.html b/versions/0.12.0/faq/develop_and_hack.html
index e1aa782..8393f57 100644
--- a/versions/0.12.0/faq/develop_and_hack.html
+++ b/versions/0.12.0/faq/develop_and_hack.html
@@ -207,7 +207,6 @@
 <div class="toctree-wrapper compound">
 <ul>
 <li class="toctree-l1"><a class="reference internal" href="new_op.html">Create new operators</a></li>
-<li class="toctree-l1"><a class="reference internal" href="torch.html">Use Torch from MXNet</a></li>
 <li class="toctree-l1"><a class="reference internal" href="env_var.html">Set environment variables of MXNet</a></li>
 </ul>
 </div>
diff --git a/versions/0.12.0/faq/index.html b/versions/0.12.0/faq/index.html
index 40e9495..8c997b1 100644
--- a/versions/0.12.0/faq/index.html
+++ b/versions/0.12.0/faq/index.html
@@ -225,7 +225,6 @@
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/add_op_in_backend.html">How do I implement operators in MXNet backend?</a></li>
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/new_op.html">How do I create new operators in MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/env_var.html">How do I set MXNet’s environmental variables?</a></li>
-<li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/torch.html">How do I use MXNet as a front end for Torch?</a></li>
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" href="#questions-about-using-mxnet">Questions about Using MXNet</a></li>
@@ -291,7 +290,6 @@ and full working examples, visit the <a class="reference internal" href="../tuto
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/add_op_in_backend.html">How do I implement operators in MXNet backend?</a></li>
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/new_op.html">How do I create new operators in MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/env_var.html">How do I set MXNet’s environmental variables?</a></li>
-<li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/torch.html">How do I use MXNet as a front end for Torch?</a></li>
 </ul>
 </div>
 </div>
diff --git a/versions/0.12.0/how_to/develop_and_hack.html b/versions/0.12.0/how_to/develop_and_hack.html
index 267c647..cd3b98b 100644
--- a/versions/0.12.0/how_to/develop_and_hack.html
+++ b/versions/0.12.0/how_to/develop_and_hack.html
@@ -169,7 +169,6 @@
 <div class="toctree-wrapper compound">
 <ul>
 <li class="toctree-l1"><a class="reference internal" href="new_op.html">Create new operators</a></li>
-<li class="toctree-l1"><a class="reference internal" href="torch.html">Use Torch from MXNet</a></li>
 <li class="toctree-l1"><a class="reference internal" href="env_var.html">Set environment variables of MXNet</a></li>
 </ul>
 </div>
diff --git a/versions/0.12.0/how_to/index.html b/versions/0.12.0/how_to/index.html
index de6c925..7a75c63 100644
--- a/versions/0.12.0/how_to/index.html
+++ b/versions/0.12.0/how_to/index.html
@@ -188,7 +188,6 @@
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/community/contribute.html">How do I contribute a patch to MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/new_op.html">How do I create new operators in MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/env_var.html">How do I set MXNet’s environmental variables?</a></li>
-<li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/torch.html">How do I use MXNet as a front end for Torch?</a></li>
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" href="#questions-about-using-mxnet">Questions about Using MXNet</a></li>
@@ -251,7 +250,6 @@ and full working examples, visit the <a class="reference internal" href="../tuto
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/community/contribute.html">How do I contribute a patch to MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/new_op.html">How do I create new operators in MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/env_var.html">How do I set MXNet’s environmental variables?</a></li>
-<li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/torch.html">How do I use MXNet as a front end for Torch?</a></li>
 </ul>
 </div>
 </div>
diff --git a/versions/0.12.1/api/python/model.html b/versions/0.12.1/api/python/model.html
index d9efe56..73e8bce 100644
--- a/versions/0.12.1/api/python/model.html
+++ b/versions/0.12.1/api/python/model.html
@@ -2170,7 +2170,7 @@ by <a class="reference internal" href="optimization/optimization.html#mxnet.opti
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" href="http://cs229.stanford.edu/proj2015/054_report.pdf">http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ">https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those accepted
 by <a class="reference internal" href="optimization/optimization.html#mxnet.optimizer.Optimizer" title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/0.12.1/api/python/optimization.html b/versions/0.12.1/api/python/optimization.html
index 3e6c6ae..ee20509 100644
--- a/versions/0.12.1/api/python/optimization.html
+++ b/versions/0.12.1/api/python/optimization.html
@@ -811,7 +811,7 @@ by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" href="http://cs229.stanford.edu/proj2015/054_report.pdf">http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ">https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those accepted
 by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/0.12.1/api/python/optimization/optimization.html b/versions/0.12.1/api/python/optimization/optimization.html
index b619578..e6e50a6 100644
--- a/versions/0.12.1/api/python/optimization/optimization.html
+++ b/versions/0.12.1/api/python/optimization/optimization.html
@@ -961,7 +961,7 @@ by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" href="http://cs229.stanford.edu/proj2015/054_report.pdf">http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ">https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those accepted
 by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/0.12.1/faq/develop_and_hack.html b/versions/0.12.1/faq/develop_and_hack.html
index 3380cbf..c156f09 100644
--- a/versions/0.12.1/faq/develop_and_hack.html
+++ b/versions/0.12.1/faq/develop_and_hack.html
@@ -207,7 +207,6 @@
 <div class="toctree-wrapper compound">
 <ul>
 <li class="toctree-l1"><a class="reference internal" href="new_op.html">Create new operators</a></li>
-<li class="toctree-l1"><a class="reference internal" href="torch.html">Use Torch from MXNet</a></li>
 <li class="toctree-l1"><a class="reference internal" href="env_var.html">Set environment variables of MXNet</a></li>
 </ul>
 </div>
diff --git a/versions/0.12.1/faq/index.html b/versions/0.12.1/faq/index.html
index a500ff3..b44be60 100644
--- a/versions/0.12.1/faq/index.html
+++ b/versions/0.12.1/faq/index.html
@@ -225,7 +225,6 @@
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/add_op_in_backend.html">How do I implement operators in MXNet backend?</a></li>
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/new_op.html">How do I create new operators in MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/env_var.html">How do I set MXNet’s environmental variables?</a></li>
-<li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/torch.html">How do I use MXNet as a front end for Torch?</a></li>
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" href="#questions-about-using-mxnet">Questions about Using MXNet</a></li>
@@ -291,7 +290,6 @@ and full working examples, visit the <a class="reference internal" href="../tuto
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/add_op_in_backend.html">How do I implement operators in MXNet backend?</a></li>
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/new_op.html">How do I create new operators in MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/env_var.html">How do I set MXNet’s environmental variables?</a></li>
-<li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/torch.html">How do I use MXNet as a front end for Torch?</a></li>
 </ul>
 </div>
 </div>
diff --git a/versions/0.12.1/how_to/develop_and_hack.html b/versions/0.12.1/how_to/develop_and_hack.html
index abb1e38..ffe4f1b 100644
--- a/versions/0.12.1/how_to/develop_and_hack.html
+++ b/versions/0.12.1/how_to/develop_and_hack.html
@@ -169,7 +169,6 @@
 <div class="toctree-wrapper compound">
 <ul>
 <li class="toctree-l1"><a class="reference internal" href="new_op.html">Create new operators</a></li>
-<li class="toctree-l1"><a class="reference internal" href="torch.html">Use Torch from MXNet</a></li>
 <li class="toctree-l1"><a class="reference internal" href="env_var.html">Set environment variables of MXNet</a></li>
 </ul>
 </div>
diff --git a/versions/0.12.1/how_to/index.html b/versions/0.12.1/how_to/index.html
index c7b49a4..ac83695 100644
--- a/versions/0.12.1/how_to/index.html
+++ b/versions/0.12.1/how_to/index.html
@@ -188,7 +188,6 @@
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/community/contribute.html">How do I contribute a patch to MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/new_op.html">How do I create new operators in MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/env_var.html">How do I set MXNet’s environmental variables?</a></li>
-<li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/torch.html">How do I use MXNet as a front end for Torch?</a></li>
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" href="#questions-about-using-mxnet">Questions about Using MXNet</a></li>
@@ -251,7 +250,6 @@ and full working examples, visit the <a class="reference internal" href="../tuto
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/community/contribute.html">How do I contribute a patch to MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/new_op.html">How do I create new operators in MXNet?</a></li>
 <li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/env_var.html">How do I set MXNet’s environmental variables?</a></li>
-<li class="toctree-l1"><a class="reference external" href="https://mxnet.incubator.apache.org/how_to/torch.html">How do I use MXNet as a front end for Torch?</a></li>
 </ul>
 </div>
 </div>
diff --git a/versions/master/_modules/mxnet/optimizer.html b/versions/master/_modules/mxnet/optimizer.html
index daee3d9..aca10f1 100644
--- a/versions/master/_modules/mxnet/optimizer.html
+++ b/versions/master/_modules/mxnet/optimizer.html
@@ -1381,7 +1381,7 @@
 
 <span class="sd">    Much like Adam is essentially RMSprop with momentum,</span>
 <span class="sd">    Nadam is Adam RMSprop with Nesterov momentum available</span>
-<span class="sd">    at http://cs229.stanford.edu/proj2015/054_report.pdf.</span>
+<span class="sd">    at https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ.</span>
 
 <span class="sd">    This optimizer accepts the following parameters in addition to those accepted</span>
 <span class="sd">    by :class:`.Optimizer`.</span>
diff --git a/versions/master/api/python/model.html b/versions/master/api/python/model.html
index 496a3b9..40404c7 100644
--- a/versions/master/api/python/model.html
+++ b/versions/master/api/python/model.html
@@ -2238,7 +2238,7 @@ by <a class="reference internal" href="optimization/optimization.html#mxnet.opti
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" href="http://cs229.stanford.edu/proj2015/054_report.pdf">http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ">https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those accepted
 by <a class="reference internal" href="optimization/optimization.html#mxnet.optimizer.Optimizer" title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/master/api/python/optimization.html b/versions/master/api/python/optimization.html
index d0a5c53..5fd057d 100644
--- a/versions/master/api/python/optimization.html
+++ b/versions/master/api/python/optimization.html
@@ -863,7 +863,7 @@ by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" href="http://cs229.stanford.edu/proj2015/054_report.pdf">http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ">https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those accepted
 by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/master/api/python/optimization/optimization.html b/versions/master/api/python/optimization/optimization.html
index 4683430..4abd9b5 100644
--- a/versions/master/api/python/optimization/optimization.html
+++ b/versions/master/api/python/optimization/optimization.html
@@ -1029,7 +1029,7 @@ by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.
 <dd><p>The Nesterov Adam optimizer.</p>
 <p>Much like Adam is essentially RMSprop with momentum,
 Nadam is Adam RMSprop with Nesterov momentum available
-at <a class="reference external" href="http://cs229.stanford.edu/proj2015/054_report.pdf">http://cs229.stanford.edu/proj2015/054_report.pdf</a>.</p>
+at <a class="reference external" href="https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ">https://openreview.net/pdf?id=OM0jvwB8jIp57ZJjtNEZ</a>.</p>
 <p>This optimizer accepts the following parameters in addition to those accepted
 by <a class="reference internal" href="#mxnet.optimizer.Optimizer" title="mxnet.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">Optimizer</span></code></a>.</p>
 <table class="docutils field-list" frame="void" rules="none">
diff --git a/versions/master/how_to/develop_and_hack.html b/versions/master/how_to/develop_and_hack.html
index 3b09fd2..1597351 100644
--- a/versions/master/how_to/develop_and_hack.html
+++ b/versions/master/how_to/develop_and_hack.html
@@ -198,7 +198,6 @@
 <div class="toctree-wrapper compound">
 <ul>
 <li class="toctree-l1"><a class="reference internal" href="new_op.html">Create new operators</a></li>
-<li class="toctree-l1"><a class="reference internal" href="torch.html">Use Torch from MXNet</a></li>
 <li class="toctree-l1"><a class="reference internal" href="env_var.html">Set environment variables of MXNet</a></li>
 </ul>
 </div>
diff --git a/versions/master/how_to/index.html b/versions/master/how_to/index.html
index 9720ea4..eba1795 100644
--- a/versions/master/how_to/index.html
+++ b/versions/master/how_to/index.html
@@ -215,7 +215,6 @@
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/versions/master/community/contribute.html">How do I contribute a patch to MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/versions/master/how_to/new_op.html">How do I create new operators in MXNet?</a></li>
 <li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/versions/master/how_to/env_var.html">How do I set MXNet’s environmental variables?</a></li>
-<li class="toctree-l3"><a class="reference external" href="https://mxnet.incubator.apache.org/versions/master/how_to/torch.html">How do I use MXNet as a front end for Torch?</a></li>
 </ul>
 </li>
 <li class="toctree-l2"><a class="reference internal" href="#questions-about-using-mxnet">Questions about Using MXNet</a></li>

-- 
To stop receiving notification emails like this one, please contact commits@mxnet.apache.org.