Posted to commits@singa.apache.org by wa...@apache.org on 2016/10/06 07:44:51 UTC
svn commit: r1763509 [1/2] - in /incubator/singa/site/trunk/en:
_sources/develop/schedule.txt _sources/docs/installation.txt
docs/installation.html docs/optimizer.html docs/tensor.html genindex.html
objects.inv searchindex.js
Author: wangwei
Date: Thu Oct 6 07:44:51 2016
New Revision: 1763509
URL: http://svn.apache.org/viewvc?rev=1763509&view=rev
Log:
update API docs and FAQ
Modified:
incubator/singa/site/trunk/en/_sources/develop/schedule.txt
incubator/singa/site/trunk/en/_sources/docs/installation.txt
incubator/singa/site/trunk/en/docs/installation.html
incubator/singa/site/trunk/en/docs/optimizer.html
incubator/singa/site/trunk/en/docs/tensor.html
incubator/singa/site/trunk/en/genindex.html
incubator/singa/site/trunk/en/objects.inv
incubator/singa/site/trunk/en/searchindex.js
Modified: incubator/singa/site/trunk/en/_sources/develop/schedule.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/develop/schedule.txt?rev=1763509&r1=1763508&r2=1763509&view=diff
==============================================================================
--- incubator/singa/site/trunk/en/_sources/develop/schedule.txt (original)
+++ incubator/singa/site/trunk/en/_sources/develop/schedule.txt Thu Oct 6 07:44:51 2016
@@ -20,38 +20,44 @@ Development Schedule
====================
.. csv-table::
- :header: "Release", "Module", "Feature", "Status"
+ :header: "Release","Module","Feature"
- " 0.1 Sep 2015 "," Neural Network "," Feed forward neural network, including CNN, MLP "," done "
- " "," "," RBM-like model, including RBM "," done "
- " "," "," Recurrent neural network, including standard RNN "," done "
- " "," Architecture "," One worker group on single node (with data partition) "," done "
- " "," "," Multi worker groups on single node using [Hogwild](http://www.eecs.berkeley.edu/~brecht/papers/hogwildTR.pdf) ","done"
- " "," "," Distributed Hogwild","done"
- " "," "," Multi groups across nodes, like [Downpour](http://papers.nips.cc/paper/4687-large-scale-distributed-deep-networks) ","done"
- " "," "," All-Reduce training architecture like [DeepImage](http://arxiv.org/abs/1501.02876) ","done"
- " "," "," Load-balance among servers "," done"
- " "," Failure recovery "," Checkpoint and restore ","done"
- " "," Tools "," Installation with GNU auto tools"," done"
- "0.2 Jan 2016 "," Neural Network "," Feed forward neural network, including AlexNet, cuDNN layers, etc."," done "
- " "," "," Recurrent neural network, including GRULayer and BPTT","done "
- " "," "," Model partition and hybrid partition","done"
- " "," Tools "," Integration with Mesos for resource management","done"
- " "," "," Prepare Docker images for deployment","done"
- " "," "," Visualization of neural net and debug information ","done"
- " "," Binding "," Python binding for major components ","done"
- " "," GPU "," Single node with multiple GPUs ","done"
- "0.3 April 2016 "," GPU "," Multiple nodes, each with multiple GPUs","done"
- " "," "," Heterogeneous training using both GPU and CPU [CcT](http://arxiv.org/abs/1504.04343)","done"
- " "," "," Support cuDNN v4 "," done"
- " "," Installation "," Remove dependency on ZeroMQ, CZMQ, Zookeeper for single node training","done"
- " "," Updater "," Add new SGD updaters including Adam, AdamMax and AdaDelta","done"
- " "," Binding "," Enhance Python binding for training","done"
- "1.0 Sep 2016 "," Programming abstraction ","Tensor with linear algebra, neural net and random operations "," "
- " "," ","Updater for distributed parameter updating ",""
- " "," Hardware "," Use Cuda and Cudnn for Nvidia GPU",""
- " "," "," Use OpenCL for AMD GPU or other devices",""
- " "," Cross-platform "," To extend from Linux to MacOS",""
- " "," Examples "," Speech recognition example",""
- " "," ","Large image models, e.g., [VGG](https://arxiv.org/pdf/1409.1556.pdf) and [Residual Net](http://arxiv.org/abs/1512.03385)",""
- "1.1 Dec 2016 "," ",""," "
+ "0.1 Sep 2015 ","Neural Network ","Feed forward neural network, including CNN, MLP "
+ " "," ","RBM-like model, including RBM "
+ " "," ","Recurrent neural network, including standard RNN "
+ " ","Architecture ","One worker group on single node (with data partition) "
+ " "," ","Multi worker groups on single node using `Hogwild <http://www.eecs.berkeley.edu/~brecht/papers/hogwildTR.pdf>`_ "
+ " "," ","Distributed Hogwild"
+ " "," ","Multi groups across nodes, like `Downpour <http://papers.nips.cc/paper/4687-large-scale-distributed-deep-networks>`_"
+ " "," ","All-Reduce training architecture like `DeepImage <http://arxiv.org/abs/1501.02876>`_ "
+ " "," ","Load-balance among servers "
+ " ","Failure recovery ","Checkpoint and restore "
+ " ","Tools ","Installation with GNU autotools "
+ "0.2 Jan 2016 ","Neural Network ","Feed forward neural network, including AlexNet, cuDNN layers, etc. "
+ " "," ","Recurrent neural network, including GRULayer and BPTT "
+ " "," ","Model partition and hybrid partition "
+ " ","Tools ","Integration with Mesos for resource management "
+ " "," ","Prepare Docker images for deployment"
+ " "," ","Visualization of neural net and debug information "
+ " ","Binding ","Python binding for major components "
+ " ","GPU ","Single node with multiple GPUs "
+ "0.3 April 2016 ","GPU ","Multiple nodes, each with multiple GPUs"
+ " "," ","Heterogeneous training using both GPU and CPU `CcT <http://arxiv.org/abs/1504.04343>`_"
+ " "," ","Support cuDNN v4 "
+ " ","Installation ","Remove dependency on ZeroMQ, CZMQ, Zookeeper for single node training"
+ " ","Updater ","Add new SGD updaters including Adam, AdamMax and AdaDelta"
+ " ","Binding ","Enhance Python binding for training"
+ "1.0 Sep 2016 ","Programming abstraction ","Tensor with linear algebra, neural net and random operations "
+ " "," ","Updater for distributed parameter updating "
+ " ","Hardware ","Use Cuda and Cudnn for Nvidia GPU"
+ " "," ","Use OpenCL for AMD GPU or other devices"
+ " ","Cross-platform ","To extend from Linux to MacOS"
+ " "," ","Large image models, e.g., `VGG <https://arxiv.org/pdf/1409.1556.pdf>`_ and `Residual Net <http://arxiv.org/abs/1512.03385>`_"
+ "1.1 Dec 2016 ","Model Zoo ","Health-care models and popular image models"
+ " ","Caffe converter ","Use SINGA to train models configured in caffe proto files"
+ " ","Memory optimization ","Replace CNMEM with new memory pool to reduce memory footprint"
+ " ","Distributed training ","Migrate distributed training frameworks from V0.3"
+ " ","Compilation and installation ","Windows support"
+ " "," ","Simplify the installation by compiling protobuf and openblas together with SINGA"
+ " "," ","Build python wheel automatically using Jenkins"
+ " "," ","Deploy SINGA programs on Android phones for prediction tasks"
Modified: incubator/singa/site/trunk/en/_sources/docs/installation.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/installation.txt?rev=1763509&r1=1763508&r2=1763509&view=diff
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/installation.txt (original)
+++ incubator/singa/site/trunk/en/_sources/docs/installation.txt Thu Oct 6 07:44:51 2016
@@ -37,7 +37,7 @@ The following instructions are tested on
# optional libraries
$ sudo apt-get install python2.7-dev python-pip python-numpy
- $ sudo apt-get install llibopencv-dev ibgoogle-glog-dev liblmdb-dev
+ $ sudo apt-get install libopencv-dev libgoogle-glog-dev liblmdb-dev
Please note that PySINGA requires swig >=3.0, which could be installed via
apt-get on Ubuntu 16.04; but it has to be installed from source for other Ubuntu versions including 14.04.
@@ -68,6 +68,7 @@ To let the runtime know the openblas pat
### pip and anaconda for PySINGA
pip and anaconda could be used to install python packages, e.g. numpy.
+A Python virtual environment is recommended for running PySINGA.
To use pip with virtual environment,
# install virtualenv
@@ -219,6 +220,30 @@ To be added.
## FAQ
+* Q: Error from 'import singa' using PySINGA installed from wheel.
+
+ A: Please check the detailed error from `python -c "from singa import _singa_wrap"`. Sometimes it is
+ caused by the dependent libraries, e.g., there are multiple versions of protobuf, or cudnn is missing. The
+ following steps show the solutions for different cases:
+ 1. Check the cudnn, cuda and gcc versions; cudnn5, cuda7.5 and gcc4.8/4.9 are preferred. If gcc is 5.0, downgrade it.
+ If cudnn is missing or does not match the wheel version, you can download the correct version of cudnn into ~/local/cudnn/ and
+ ```
+ echo "export LD_LIBRARY_PATH=/home/<yourname>/local/cudnn/lib64:$LD_LIBRARY_PATH" >> ~/.bashrc
+ ```
+ 2. If the problem is related to protobuf, it is better to install protobuf from source into a local folder, say ~/local/.
+ Decompress the tar file, and then
+ ```
+ ./configure --prefix=/home/<yourname>/local
+ make && make install
+ echo "export LD_LIBRARY_PATH=/home/<yourname>/local/lib:$LD_LIBRARY_PATH" >> ~/.bashrc
+ source ~/.bashrc
+ ```
+ 3. If it cannot find other libs, including python, create a virtual env using pip or conda,
+ and then install SINGA via
+ ```
+ pip install --upgrade <url of singa wheel>
+ ```
+
+
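[Editor's note: the three FAQ fixes above can be condensed into a short shell sketch. All paths here (`~/local`, the virtualenv location) are illustrative assumptions, not prescribed locations; the wheel URL is left as a placeholder.]

```shell
# Sketch of the three fixes above; ~/local paths are illustrative.
# 1. Point the dynamic loader at a locally installed cuDNN.
export LD_LIBRARY_PATH="$HOME/local/cudnn/lib64:$LD_LIBRARY_PATH"

# 2. Build protobuf from source into ~/local (run inside its source tree):
#    ./configure --prefix=$HOME/local && make && make install
export LD_LIBRARY_PATH="$HOME/local/lib:$LD_LIBRARY_PATH"

# 3. Install the wheel inside a fresh virtualenv:
#    virtualenv ~/singa-env && . ~/singa-env/bin/activate
#    pip install --upgrade <url of singa wheel>

# Verify the loader path now includes both local lib dirs.
echo "$LD_LIBRARY_PATH"
```

Appending the same `export` lines to `~/.bashrc`, as the FAQ does, makes the change persistent across shells.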
* Q: Error from running `cmake ..`, which cannot find the dependent libraries.
A: If you haven't installed the libraries, please install them. If you installed
@@ -276,7 +301,7 @@ To be added.
* Q: When I build protocol buffer, it reports that GLIBC++_3.4.20 not found in /usr/lib64/libstdc++.so.6.
- A9: This means the linker found libstdc++.so.6 but that library
+ A: This means the linker found libstdc++.so.6 but that library
belongs to an older version of GCC than was used to compile and link the
program. The program depends on code defined in
the newer libstdc++ that belongs to the newer version of GCC, so the linker
Modified: incubator/singa/site/trunk/en/docs/installation.html
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/docs/installation.html?rev=1763509&r1=1763508&r2=1763509&view=diff
==============================================================================
--- incubator/singa/site/trunk/en/docs/installation.html (original)
+++ incubator/singa/site/trunk/en/docs/installation.html Thu Oct 6 07:44:51 2016
@@ -225,7 +225,7 @@ $ sudo apt-get install libprotobuf-dev l
# optional libraries
$ sudo apt-get install python2.7-dev python-pip python-numpy
-$ sudo apt-get install llibopencv-dev ibgoogle-glog-dev liblmdb-dev
+$ sudo apt-get install libopencv-dev libgoogle-glog-dev liblmdb-dev
</pre></div>
</div>
<p>Please note that PySINGA requires swig >=3.0, which could be installed via
@@ -410,7 +410,31 @@ Remember to add its directory to <code c
<ul>
<li><p class="first">Q: Error from ‘import singa’ using PySINGA installed from wheel.</p>
<p>A: Please check the detailed error from <code class="docutils literal"><span class="pre">python</span> <span class="pre">-c</span> <span class="pre">"from</span> <span class="pre">singa</span> <span class="pre">import</span> <span class="pre">_singa_wrap"</span></code>. Sometimes it is
-caused by the dependent libraries, e.g. there are multiple versions of protobuf.</p>
+caused by the dependent libraries, e.g., there are multiple versions of protobuf, or cudnn is missing. The
+following steps show the solutions for different cases:</p>
+<ol>
+<li><p class="first">Check the cudnn, cuda and gcc versions; cudnn5, cuda7.5 and gcc4.8/4.9 are preferred. If gcc is 5.0, downgrade it.
+If cudnn is missing or does not match the wheel version, you can download the correct version of cudnn into ~/local/cudnn/ and</p>
+<div class="highlight-default"><div class="highlight"><pre><span class="n">echo</span> <span class="s">"export LD_LIBRARY_PATH=/home/<yourname>/local/cudnn/lib64:$LD_LIBRARY_PATH"</span> <span class="o">>></span> <span class="o">~/.</span><span class="n">bashrc</span>
+</pre></div>
+</div>
+</li>
+<li><p class="first">If the problem is related to protobuf, it is better to install protobuf from source into a local folder, say ~/local/.
+Decompress the tar file, and then</p>
+<div class="highlight-default"><div class="highlight"><pre><span class="o">./</span><span class="n">configure</span> <span class="o">--</span><span class="n">prefix</span><span class="o">=/</span><span class="n">home</span><span class="o">/<</span><span class="n">yourname</span><span class="o">>/</span><span class="n">local</span>
+<span class="n">make</span> <span class="o">&&</span> <span class="n">make</span> <span class="n">install</span>
+<span class="n">echo</span> <span class="s">"export LD_LIBRARY_PATH=/home/<yourname>/local/lib:$LD_LIBRARY_PATH"</span> <span class="o">>></span> <span class="o">~/.</span><span class="n">bashrc</span>
+<span class="n">source</span> <span class="o">~/.</span><span class="n">bashrc</span>
+</pre></div>
+</div>
+</li>
+<li><p class="first">If it cannot find other libs, including python, create a virtual env using pip or conda,
+and then install SINGA via</p>
+<div class="highlight-default"><div class="highlight"><pre><span class="n">pip</span> <span class="n">install</span> <span class="o">--</span><span class="n">upgrade</span> <span class="o"><</span><span class="n">url</span> <span class="n">of</span> <span class="n">singa</span> <span class="n">wheel</span><span class="o">></span>
+</pre></div>
+</div>
+</li>
+</ol>
</li>
</ul>
<ul>
Modified: incubator/singa/site/trunk/en/docs/optimizer.html
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/docs/optimizer.html?rev=1763509&r1=1763508&r2=1763509&view=diff
==============================================================================
--- incubator/singa/site/trunk/en/docs/optimizer.html (original)
+++ incubator/singa/site/trunk/en/docs/optimizer.html Thu Oct 6 07:44:51 2016
@@ -180,7 +180,7 @@
</div>
<dl class="class">
<dt id="singa.optimizer.Optimizer">
-<em class="property">class </em><code class="descclassname">singa.optimizer.</code><code class="descname">Optimizer</code><span class="sig-paren">(</span><em>lr=None</em>, <em>momentum=None</em>, <em>weight_decay=None</em>, <em>lr_gen=None</em>, <em>regularizer=None</em>, <em>constraint=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.Optimizer" title="Permalink to this definition">¶</a></dt>
+<em class="property">class </em><code class="descclassname">singa.optimizer.</code><code class="descname">Optimizer</code><span class="sig-paren">(</span><em>lr=None</em>, <em>momentum=None</em>, <em>weight_decay=None</em>, <em>regularizer=None</em>, <em>constraint=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.Optimizer" title="Permalink to this definition">¶</a></dt>
<dd><p>Bases: <code class="xref py py-class docutils literal"><span class="pre">object</span></code></p>
<p>The base python optimizer class.</p>
<p>Typically, an optimizer is used as follows:</p>
@@ -197,21 +197,16 @@ parameter update.</p>
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
-<li><strong>lr</strong> (<em>float</em>) – a constant for the learning rate, mutually exclusive with
-‘lr_gen’.</li>
-<li><strong>momentum</strong> (<em>float</em>) – a constant for the momentum value</li>
+<li><strong>lr</strong> (<em>float</em>) – a constant value for the learning rate</li>
+<li><strong>momentum</strong> (<em>float</em>) – a constant value for the momentum</li>
<li><strong>weight_decay</strong> (<em>float</em>) – the coefficient for L2 regularizer, which is
mutually exclusive with ‘regularizer’.</li>
-<li><strong>lr_gen</strong> (<em>function</em>) – a function returns the learning rate given
-the current training step/epoch. It is mutually exclusive with lr.
-If both are not set, the apply_with_lr function should be used for
-param updating.</li>
<li><strong>regularizer</strong> – an instance of Regularizer or RegularizerConf; If set,
regularization would be applied in apply_with_lr().
Users can also do regularization outside.</li>
<li><strong>constraint</strong> – an instance of Constraint or ConstraintConf; If set,
constraint would be applied inside apply_with_lr(). Users can
-also do regularization outside.</li>
+also apply constraint outside.</li>
</ul>
</td>
</tr>
@@ -222,7 +217,9 @@ also do regularization outside.</li>
<code class="descname">register</code><span class="sig-paren">(</span><em>name</em>, <em>specs</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.Optimizer.register" title="Permalink to this definition">¶</a></dt>
<dd><p>Register the param specs, including creating regularizer and
constraint per param object. Param specific regularizer and constraint
-have higher priority than the global ones.</p>
+have higher priority than the global ones. If all parameters share the
+same setting for learning rate, regularizer and constraint, then there
+is no need to call this function.</p>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
@@ -240,7 +237,7 @@ constraint, multipliers for learning rat
<dl class="method">
<dt id="singa.optimizer.Optimizer.apply_regularizer_constraint">
-<code class="descname">apply_regularizer_constraint</code><span class="sig-paren">(</span><em>epoch</em>, <em>value</em>, <em>grad</em>, <em>name=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.Optimizer.apply_regularizer_constraint" title="Permalink to this definition">¶</a></dt>
+<code class="descname">apply_regularizer_constraint</code><span class="sig-paren">(</span><em>epoch</em>, <em>value</em>, <em>grad</em>, <em>name=None</em>, <em>step=-1</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.Optimizer.apply_regularizer_constraint" title="Permalink to this definition">¶</a></dt>
<dd><p>Apply regularization and constraint if available.</p>
<p>If there are both global regularizer (constraint) and param specific
regularizer (constraint), it would use the param specific one.</p>
@@ -249,10 +246,11 @@ regularizer (constraint), it would use t
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple">
+<li><strong>epoch</strong> (<em>int</em>) – training epoch ID</li>
<li><strong>value</strong> (<a class="reference internal" href="tensor.html#singa.tensor.Tensor" title="singa.tensor.Tensor"><em>Tensor</em></a>) – parameter value Tensor</li>
<li><strong>grad</strong> (<a class="reference internal" href="tensor.html#singa.tensor.Tensor" title="singa.tensor.Tensor"><em>Tensor</em></a>) – parameter gradient Tensor</li>
<li><strong>name</strong> (<em>string</em>) – to get parameter specific regularizer or constraint</li>
-<li><strong>epoch</strong> (<em>int</em>) – some regularizer or constraint would use epoch</li>
+<li><strong>step</strong> (<em>int</em>) – iteration ID within one epoch</li>
</ul>
</td>
</tr>
@@ -265,20 +263,23 @@ regularizer (constraint), it would use t
<dl class="method">
<dt id="singa.optimizer.Optimizer.apply_with_lr">
-<code class="descname">apply_with_lr</code><span class="sig-paren">(</span><em>epoch</em>, <em>lr</em>, <em>grad</em>, <em>value</em>, <em>name=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.Optimizer.apply_with_lr" title="Permalink to this definition">¶</a></dt>
-<dd><p>Do update with given learning rate.</p>
-<p>The subclass optimizer must override this function.</p>
+<code class="descname">apply_with_lr</code><span class="sig-paren">(</span><em>epoch</em>, <em>lr</em>, <em>grad</em>, <em>value</em>, <em>name=None</em>, <em>step=-1</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.Optimizer.apply_with_lr" title="Permalink to this definition">¶</a></dt>
+<dd><p>Update the parameters with the given learning rate if the grad is not
+empty.</p>
+<p>The subclass optimizer must override this function.
+This function does nothing if the grad is empty.</p>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple">
-<li><strong>epoch</strong> (<em>int</em>) – training epoch (could be iteration or epoch)</li>
+<li><strong>epoch</strong> (<em>int</em>) – training epoch ID</li>
<li><strong>lr</strong> (<em>float</em>) – learning rate</li>
<li><strong>grad</strong> (<a class="reference internal" href="tensor.html#singa.tensor.Tensor" title="singa.tensor.Tensor"><em>Tensor</em></a>) – parameter gradient</li>
<li><strong>value</strong> (<em>Tensor</em>) – parameter value</li>
-<li><strong>name</strong> (<em>string</em>) – paramter name to retrieval parameter specific
+<li><strong>name</strong> (<em>string</em>) – parameter name to index parameter specific
updating rules (including regularizer and constraint)</li>
+<li><strong>step</strong> (<em>int</em>) – iteration ID within one epoch</li>
</ul>
</td>
</tr>
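[Editor's note: the `apply_with_lr(epoch, lr, grad, value, name, step)` convention documented in the hunk above can be illustrated with a plain-Python stand-in. `SGDSketch` below is a hypothetical mock, not the real `singa.optimizer.SGD`; it only mirrors the signature and the "do nothing if the grad is empty" behaviour.]

```python
# Hypothetical stand-in mirroring the documented apply_with_lr signature;
# NOT the real singa.optimizer.SGD.
class SGDSketch:
    def __init__(self, momentum=0.0):
        self.momentum = momentum
        self.moments = {}  # per-parameter momentum buffers, keyed by name

    def apply_with_lr(self, epoch, lr, grad, value, name=None, step=-1):
        # Mirrors the documented behaviour: do nothing if the grad is empty.
        if not grad:
            return value
        buf = self.moments.get(name, [0.0] * len(grad))
        # Classic momentum SGD: buf = momentum * buf + grad; value -= lr * buf.
        buf = [self.momentum * b + g for b, g in zip(buf, grad)]
        self.moments[name] = buf
        return [v - lr * b for v, b in zip(value, buf)]

opt = SGDSketch(momentum=0.9)
w = opt.apply_with_lr(epoch=0, lr=0.1, grad=[0.5, 0.5],
                      value=[1.0, 2.0], name="w")
```

The `name` argument is what lets the optimizer look up parameter-specific state (here the momentum buffer; in SINGA also the registered regularizer and constraint).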
@@ -291,7 +292,7 @@ updating rules (including regularizer an
<dl class="method">
<dt id="singa.optimizer.Optimizer.apply">
-<code class="descname">apply</code><span class="sig-paren">(</span><em>epoch</em>, <em>grad</em>, <em>value</em>, <em>name=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.Optimizer.apply" title="Permalink to this definition">¶</a></dt>
+<code class="descname">apply</code><span class="sig-paren">(</span><em>epoch</em>, <em>grad</em>, <em>value</em>, <em>name=None</em>, <em>step=-1</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.Optimizer.apply" title="Permalink to this definition">¶</a></dt>
<dd><p>Do update assuming the learning rate generator is set.</p>
<p>The subclass optimizer does not need to override this function.</p>
<table class="docutils field-list" frame="void" rules="none">
@@ -299,11 +300,12 @@ updating rules (including regularizer an
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first simple">
-<li><strong>epoch</strong> (<em>int</em>) – training epoch (could be iteration or epoch)</li>
+<li><strong>epoch</strong> (<em>int</em>) – training epoch ID</li>
<li><strong>grad</strong> (<a class="reference internal" href="tensor.html#singa.tensor.Tensor" title="singa.tensor.Tensor"><em>Tensor</em></a>) – parameter gradient</li>
<li><strong>value</strong> (<em>Tensor</em>) – parameter value</li>
<li><strong>name</strong> (<em>string</em>) – parameter name to retrieve parameter specific
updating rules (including regularizer and constraint)</li>
+<li><strong>step</strong> (<em>int</em>) – training iteration ID within one epoch</li>
</ul>
</td>
</tr>
@@ -318,26 +320,51 @@ updating rules (including regularizer an
<dl class="class">
<dt id="singa.optimizer.SGD">
-<em class="property">class </em><code class="descclassname">singa.optimizer.</code><code class="descname">SGD</code><span class="sig-paren">(</span><em>lr=None</em>, <em>momentum=None</em>, <em>weight_decay=None</em>, <em>lr_gen=None</em>, <em>regularizer=None</em>, <em>constraint=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.SGD" title="Permalink to this definition">¶</a></dt>
+<em class="property">class </em><code class="descclassname">singa.optimizer.</code><code class="descname">SGD</code><span class="sig-paren">(</span><em>lr=None</em>, <em>momentum=None</em>, <em>weight_decay=None</em>, <em>regularizer=None</em>, <em>constraint=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.SGD" title="Permalink to this definition">¶</a></dt>
<dd><p>Bases: <a class="reference internal" href="#singa.optimizer.Optimizer" title="singa.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">singa.optimizer.Optimizer</span></code></a></p>
<p>The vanilla Stochastic Gradient Descent algorithm with momentum.</p>
<p>See the base Optimizer for all arguments.</p>
<dl class="method">
<dt id="singa.optimizer.SGD.apply_with_lr">
-<code class="descname">apply_with_lr</code><span class="sig-paren">(</span><em>epoch</em>, <em>lr</em>, <em>grad</em>, <em>value</em>, <em>name</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.SGD.apply_with_lr" title="Permalink to this definition">¶</a></dt>
+<code class="descname">apply_with_lr</code><span class="sig-paren">(</span><em>epoch</em>, <em>lr</em>, <em>grad</em>, <em>value</em>, <em>name</em>, <em>step=-1</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.SGD.apply_with_lr" title="Permalink to this definition">¶</a></dt>
<dd></dd></dl>
</dd></dl>
<dl class="class">
<dt id="singa.optimizer.Nesterov">
-<em class="property">class </em><code class="descclassname">singa.optimizer.</code><code class="descname">Nesterov</code><span class="sig-paren">(</span><em>lr=None</em>, <em>momentum=0.9</em>, <em>weight_decay=None</em>, <em>lr_gen=None</em>, <em>regularizer=None</em>, <em>constraint=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.Nesterov" title="Permalink to this definition">¶</a></dt>
+<em class="property">class </em><code class="descclassname">singa.optimizer.</code><code class="descname">Nesterov</code><span class="sig-paren">(</span><em>lr=None</em>, <em>momentum=0.9</em>, <em>weight_decay=None</em>, <em>regularizer=None</em>, <em>constraint=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.Nesterov" title="Permalink to this definition">¶</a></dt>
<dd><p>Bases: <a class="reference internal" href="#singa.optimizer.Optimizer" title="singa.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">singa.optimizer.Optimizer</span></code></a></p>
<p>The SGD with Nesterov momentum.</p>
<p>See the base Optimizer for all arguments.</p>
<dl class="method">
<dt id="singa.optimizer.Nesterov.apply_with_lr">
-<code class="descname">apply_with_lr</code><span class="sig-paren">(</span><em>epoch</em>, <em>lr</em>, <em>grad</em>, <em>value</em>, <em>name</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.Nesterov.apply_with_lr" title="Permalink to this definition">¶</a></dt>
+<code class="descname">apply_with_lr</code><span class="sig-paren">(</span><em>epoch</em>, <em>lr</em>, <em>grad</em>, <em>value</em>, <em>name</em>, <em>step=-1</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.Nesterov.apply_with_lr" title="Permalink to this definition">¶</a></dt>
+<dd></dd></dl>
+
+</dd></dl>
+
+<dl class="class">
+<dt id="singa.optimizer.RMSProp">
+<em class="property">class </em><code class="descclassname">singa.optimizer.</code><code class="descname">RMSProp</code><span class="sig-paren">(</span><em>rho=0.9</em>, <em>epsilon=1e-08</em>, <em>lr=None</em>, <em>weight_decay=None</em>, <em>regularizer=None</em>, <em>constraint=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.RMSProp" title="Permalink to this definition">¶</a></dt>
+<dd><p>Bases: <a class="reference internal" href="#singa.optimizer.Optimizer" title="singa.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">singa.optimizer.Optimizer</span></code></a></p>
+<p>RMSProp optimizer.</p>
+<p>See the base Optimizer for all constructor args.</p>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
+<li><strong>rho</strong> (<em>float</em>) – float within [0, 1]</li>
+<li><strong>epsilon</strong> (<em>float</em>) – small value for preventing numeric error</li>
+</ul>
+</td>
+</tr>
+</tbody>
+</table>
+<dl class="method">
+<dt id="singa.optimizer.RMSProp.apply_with_lr">
+<code class="descname">apply_with_lr</code><span class="sig-paren">(</span><em>epoch</em>, <em>lr</em>, <em>grad</em>, <em>value</em>, <em>name</em>, <em>step=-1</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.RMSProp.apply_with_lr" title="Permalink to this definition">¶</a></dt>
<dd></dd></dl>
</dd></dl>
@@ -358,23 +385,24 @@ updating rules (including regularizer an
</table>
<dl class="method">
<dt id="singa.optimizer.AdaGrad.apply_with_lr">
-<code class="descname">apply_with_lr</code><span class="sig-paren">(</span><em>epoch</em>, <em>lr</em>, <em>grad</em>, <em>value</em>, <em>name</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.AdaGrad.apply_with_lr" title="Permalink to this definition">¶</a></dt>
+<code class="descname">apply_with_lr</code><span class="sig-paren">(</span><em>epoch</em>, <em>lr</em>, <em>grad</em>, <em>value</em>, <em>name</em>, <em>step=-1</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.AdaGrad.apply_with_lr" title="Permalink to this definition">¶</a></dt>
<dd></dd></dl>
</dd></dl>
<dl class="class">
-<dt id="singa.optimizer.RMSProp">
-<em class="property">class </em><code class="descclassname">singa.optimizer.</code><code class="descname">RMSProp</code><span class="sig-paren">(</span><em>rho=0.9</em>, <em>epsilon=1e-08</em>, <em>lr=None</em>, <em>weight_decay=None</em>, <em>lr_gen=None</em>, <em>regularizer=None</em>, <em>constraint=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.RMSProp" title="Permalink to this definition">¶</a></dt>
+<dt id="singa.optimizer.Adam">
+<em class="property">class </em><code class="descclassname">singa.optimizer.</code><code class="descname">Adam</code><span class="sig-paren">(</span><em>beta_1=0.9</em>, <em>beta_2=0.999</em>, <em>epsilon=1e-08</em>, <em>lr=None</em>, <em>weight_decay=None</em>, <em>regularizer=None</em>, <em>constraint=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.Adam" title="Permalink to this definition">¶</a></dt>
<dd><p>Bases: <a class="reference internal" href="#singa.optimizer.Optimizer" title="singa.optimizer.Optimizer"><code class="xref py py-class docutils literal"><span class="pre">singa.optimizer.Optimizer</span></code></a></p>
-<p>RMSProp optimizer.</p>
+<p>Adam optimizer.</p>
<p>See the base Optimizer for all constructor args.</p>
<table class="docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><ul class="first last simple">
-<li><strong>rho</strong> (<em>float</em>) – float within [0, 1]</li>
+<li><strong>beta_1</strong> (<em>float</em>) – coefficient of momentum</li>
+<li><strong>beta_2</strong> (<em>float</em>) – coefficient of aggregated squared gradient</li>
<li><strong>epsilon</strong> (<em>float</em>) – small value for preventing numeric error</li>
</ul>
</td>
@@ -382,9 +410,18 @@ updating rules (including regularizer an
</tbody>
</table>
<dl class="method">
-<dt id="singa.optimizer.RMSProp.apply_with_lr">
-<code class="descname">apply_with_lr</code><span class="sig-paren">(</span><em>epoch</em>, <em>lr</em>, <em>grad</em>, <em>value</em>, <em>name</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.RMSProp.apply_with_lr" title="Permalink to this definition">¶</a></dt>
-<dd></dd></dl>
+<dt id="singa.optimizer.Adam.apply_with_lr">
+<code class="descname">apply_with_lr</code><span class="sig-paren">(</span><em>epoch</em>, <em>lr</em>, <em>grad</em>, <em>value</em>, <em>name</em>, <em>step</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.Adam.apply_with_lr" title="Permalink to this definition">¶</a></dt>
+<dd><p>Update one parameter object.</p>
+<table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>step</strong> (<em>int</em>) – the accumulated training iterations, not the iteration ID</td>
+</tr>
+</tbody>
+</table>
+</dd></dl>
</dd></dl>
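The `beta_1`, `beta_2`, and `epsilon` parameters above correspond to the standard Adam update rule, and `step` is the accumulated iteration count used for bias correction. The sketch below is plain Python illustrating that rule for a scalar parameter; it is not the SINGA API, and the helper name `adam_step` is hypothetical.

```python
import math

def adam_step(value, grad, m, v, lr, step,
              beta_1=0.9, beta_2=0.999, epsilon=1e-8):
    """One Adam update for a scalar parameter.

    `step` is the accumulated training iteration count (1-based),
    matching the `step` argument of `apply_with_lr` above.
    """
    m = beta_1 * m + (1 - beta_1) * grad          # momentum term (beta_1)
    v = beta_2 * v + (1 - beta_2) * grad * grad   # aggregated squared gradient (beta_2)
    m_hat = m / (1 - beta_1 ** step)              # bias correction uses step,
    v_hat = v / (1 - beta_2 ** step)              # not the per-epoch iteration ID
    value -= lr * m_hat / (math.sqrt(v_hat) + epsilon)
    return value, m, v

# Starting from m = v = 0, the first update moves the parameter by ~lr
# in the direction of the gradient, regardless of the gradient's scale:
w, m, v = 1.0, 0.0, 0.0
w, m, v = adam_step(w, grad=0.5, m=m, v=v, lr=0.01, step=1)
```

This is why `step` must count total iterations across epochs: resetting it each epoch would re-apply the large early bias-corrected updates.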
@@ -395,7 +432,7 @@ updating rules (including regularizer an
<p>Base Python regularizer for parameter gradients.</p>
<dl class="method">
<dt id="singa.optimizer.Regularizer.apply">
-<code class="descname">apply</code><span class="sig-paren">(</span><em>value</em>, <em>grad</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.Regularizer.apply" title="Permalink to this definition">¶</a></dt>
+<code class="descname">apply</code><span class="sig-paren">(</span><em>epoch</em>, <em>value</em>, <em>grad</em>, <em>step=-1</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.Regularizer.apply" title="Permalink to this definition">¶</a></dt>
<dd></dd></dl>
</dd></dl>
@@ -415,7 +452,7 @@ updating rules (including regularizer an
</table>
<dl class="method">
<dt id="singa.optimizer.CppRegularizer.apply">
-<code class="descname">apply</code><span class="sig-paren">(</span><em>epoch</em>, <em>value</em>, <em>grad</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.CppRegularizer.apply" title="Permalink to this definition">¶</a></dt>
+<code class="descname">apply</code><span class="sig-paren">(</span><em>epoch</em>, <em>value</em>, <em>grad</em>, <em>step=-1</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.CppRegularizer.apply" title="Permalink to this definition">¶</a></dt>
<dd></dd></dl>
</dd></dl>
@@ -435,7 +472,7 @@ updating rules (including regularizer an
</table>
<dl class="method">
<dt id="singa.optimizer.L2Regularizer.apply">
-<code class="descname">apply</code><span class="sig-paren">(</span><em>epoch</em>, <em>value</em>, <em>grad</em>, <em>coefficient=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.L2Regularizer.apply" title="Permalink to this definition">¶</a></dt>
+<code class="descname">apply</code><span class="sig-paren">(</span><em>epoch</em>, <em>value</em>, <em>grad</em>, <em>step=-1</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.L2Regularizer.apply" title="Permalink to this definition">¶</a></dt>
<dd></dd></dl>
</dd></dl>
@@ -447,7 +484,7 @@ updating rules (including regularizer an
<p>Base Python constraint class for parameter gradients</p>
<dl class="method">
<dt id="singa.optimizer.Constraint.apply">
-<code class="descname">apply</code><span class="sig-paren">(</span><em>epoch</em>, <em>value</em>, <em>grad</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.Constraint.apply" title="Permalink to this definition">¶</a></dt>
+<code class="descname">apply</code><span class="sig-paren">(</span><em>epoch</em>, <em>value</em>, <em>grad</em>, <em>step=-1</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.Constraint.apply" title="Permalink to this definition">¶</a></dt>
<dd></dd></dl>
</dd></dl>
@@ -467,7 +504,7 @@ updating rules (including regularizer an
</table>
<dl class="method">
<dt id="singa.optimizer.CppConstraint.apply">
-<code class="descname">apply</code><span class="sig-paren">(</span><em>epoch</em>, <em>value</em>, <em>grad</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.CppConstraint.apply" title="Permalink to this definition">¶</a></dt>
+<code class="descname">apply</code><span class="sig-paren">(</span><em>epoch</em>, <em>value</em>, <em>grad</em>, <em>step=-1</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.CppConstraint.apply" title="Permalink to this definition">¶</a></dt>
<dd></dd></dl>
</dd></dl>
@@ -479,7 +516,7 @@ updating rules (including regularizer an
<p>Rescale the gradient to make the L2 norm &lt;= a given threshold</p>
<dl class="method">
<dt id="singa.optimizer.L2Constraint.apply">
-<code class="descname">apply</code><span class="sig-paren">(</span><em>epoch</em>, <em>value</em>, <em>grad</em>, <em>threshold=None</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.L2Constraint.apply" title="Permalink to this definition">¶</a></dt>
+<code class="descname">apply</code><span class="sig-paren">(</span><em>epoch</em>, <em>value</em>, <em>grad</em>, <em>step=-1</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.optimizer.L2Constraint.apply" title="Permalink to this definition">¶</a></dt>
<dd></dd></dl>
</dd></dl>
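`L2Constraint` rescales a gradient whose L2 norm exceeds a threshold, i.e. standard gradient norm clipping. A minimal plain-Python sketch of that rescaling (the function name `l2_clip` is illustrative, not SINGA's API):

```python
import math

def l2_clip(grad, threshold):
    """Rescale grad (a list of floats) so its L2 norm is at most threshold."""
    norm = math.sqrt(sum(g * g for g in grad))
    if norm > threshold:
        scale = threshold / norm
        return [g * scale for g in grad]  # direction preserved, norm == threshold
    return grad                           # already within bounds: unchanged

clipped = l2_clip([3.0, 4.0], threshold=1.0)  # norm 5.0, rescaled down to norm 1.0
```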
Modified: incubator/singa/site/trunk/en/docs/tensor.html
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/docs/tensor.html?rev=1763509&r1=1763508&r2=1763509&view=diff
==============================================================================
--- incubator/singa/site/trunk/en/docs/tensor.html (original)
+++ incubator/singa/site/trunk/en/docs/tensor.html Thu Oct 6 07:44:51 2016
@@ -407,6 +407,19 @@ but is marked as a transposed version of
</dd></dl>
<dl class="method">
+<dt id="singa.tensor.Tensor.is_empty">
+<code class="descname">is_empty</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="headerlink" href="#singa.tensor.Tensor.is_empty" title="Permalink to this definition">¶</a></dt>
+<dd><table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Returns:</th><td class="field-body">True if the tensor is empty according to its shape</td>
+</tr>
+</tbody>
+</table>
+</dd></dl>
+
+<dl class="method">
<dt id="singa.tensor.Tensor.is_transpose">
<code class="descname">is_transpose</code><span class="sig-paren">(</span><span class="sig-paren">)</span><a class="headerlink" href="#singa.tensor.Tensor.is_transpose" title="Permalink to this definition">¶</a></dt>
<dd><table class="docutils field-list" frame="void" rules="none">
@@ -1118,6 +1131,21 @@ old shape.</li>
</tr>
</tbody>
</table>
+</dd></dl>
+
+<dl class="function">
+<dt id="singa.tensor.sqrt">
+<code class="descclassname">singa.tensor.</code><code class="descname">sqrt</code><span class="sig-paren">(</span><em>t</em><span class="sig-paren">)</span><a class="headerlink" href="#singa.tensor.sqrt" title="Permalink to this definition">¶</a></dt>
+<dd><table class="docutils field-list" frame="void" rules="none">
+<col class="field-name" />
+<col class="field-body" />
+<tbody valign="top">
+<tr class="field-odd field"><th class="field-name">Parameters:</th><td class="field-body"><strong>t</strong> (<a class="reference internal" href="#singa.tensor.Tensor" title="singa.tensor.Tensor"><em>Tensor</em></a>) – input Tensor</td>
+</tr>
+<tr class="field-even field"><th class="field-name">Returns:</th><td class="field-body">a new Tensor whose elements are y = sqrt(x), where x is the corresponding element of t</td>
+</tr>
+</tbody>
+</table>
</dd></dl>
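`singa.tensor.sqrt` is elementwise and returns a new Tensor, leaving the input unchanged. The semantics can be sketched in plain Python over a list (the helper `tensor_sqrt` is illustrative only):

```python
import math

def tensor_sqrt(t):
    """Elementwise square root, returning a new container; t is unchanged."""
    return [math.sqrt(x) for x in t]

t = [1.0, 4.0, 9.0]
y = tensor_sqrt(t)   # new values; t keeps its original elements
```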
<dl class="function">
Modified: incubator/singa/site/trunk/en/genindex.html
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/genindex.html?rev=1763509&r1=1763508&r2=1763509&view=diff
==============================================================================
--- incubator/singa/site/trunk/en/genindex.html (original)
+++ incubator/singa/site/trunk/en/genindex.html Thu Oct 6 07:44:51 2016
@@ -188,6 +188,10 @@
</dt>
+ <dt><a href="docs/optimizer.html#singa.optimizer.Adam">Adam (class in singa.optimizer)</a>
+ </dt>
+
+
<dt><a href="docs/tensor.html#singa.tensor.add">add() (in module singa.tensor)</a>
</dt>
@@ -211,6 +215,8 @@
</dt>
</dl></dd>
+ </dl></td>
+ <td style="width: 33%" valign="top"><dl>
<dt><a href="docs/optimizer.html#singa.optimizer.Constraint.apply">apply() (singa.optimizer.Constraint method)</a>
</dt>
@@ -241,8 +247,6 @@
</dt>
</dl></dd>
- </dl></td>
- <td style="width: 33%" valign="top"><dl>
<dt><a href="docs/optimizer.html#singa.optimizer.Optimizer.apply_regularizer_constraint">apply_regularizer_constraint() (singa.optimizer.Optimizer method)</a>
</dt>
@@ -253,6 +257,10 @@
<dd><dl>
+ <dt><a href="docs/optimizer.html#singa.optimizer.Adam.apply_with_lr">(singa.optimizer.Adam method)</a>
+ </dt>
+
+
<dt><a href="docs/optimizer.html#singa.optimizer.Nesterov.apply_with_lr">(singa.optimizer.Nesterov method)</a>
</dt>
@@ -580,6 +588,12 @@
<table style="width: 100%" class="indextable genindextable"><tr>
<td style="width: 33%" valign="top"><dl>
+ <dt><a href="docs/tensor.html#singa.tensor.Tensor.is_empty">is_empty() (singa.tensor.Tensor method)</a>
+ </dt>
+
+ </dl></td>
+ <td style="width: 33%" valign="top"><dl>
+
<dt><a href="docs/tensor.html#singa.tensor.Tensor.is_transpose">is_transpose() (singa.tensor.Tensor method)</a>
</dt>
@@ -831,12 +845,12 @@
<dt><a href="docs/utils.html#module-singa.utils">singa.utils (module)</a>
</dt>
- </dl></td>
- <td style="width: 33%" valign="top"><dl>
<dt><a href="docs/tensor.html#singa.tensor.Tensor.size">size() (singa.tensor.Tensor method)</a>
</dt>
+ </dl></td>
+ <td style="width: 33%" valign="top"><dl>
<dt><a href="docs/tensor.html#singa.tensor.sizeof">sizeof() (in module singa.tensor)</a>
</dt>
@@ -858,6 +872,10 @@
</dt>
+ <dt><a href="docs/tensor.html#singa.tensor.sqrt">sqrt() (in module singa.tensor)</a>
+ </dt>
+
+
<dt><a href="docs/tensor.html#singa.tensor.square">square() (in module singa.tensor)</a>
</dt>
Modified: incubator/singa/site/trunk/en/objects.inv
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/objects.inv?rev=1763509&r1=1763508&r2=1763509&view=diff
==============================================================================
Binary files - no diff available.