Posted to commits@singa.apache.org by wa...@apache.org on 2017/02/25 15:30:35 UTC
svn commit: r1784386 [2/19] - in /incubator/singa/site/trunk: en/
en/_sources/ en/_sources/community/ en/_sources/develop/ en/_sources/docs/
en/_sources/docs/model_zoo/ en/_sources/docs/model_zoo/caffe/
en/_sources/docs/model_zoo/char-rnn/ en/_sources/...
Added: incubator/singa/site/trunk/en/_sources/docs/model_zoo/imagenet/googlenet/README.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/model_zoo/imagenet/googlenet/README.md.txt?rev=1784386&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/model_zoo/imagenet/googlenet/README.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/model_zoo/imagenet/googlenet/README.md.txt Sat Feb 25 15:30:33 2017
@@ -0,0 +1,66 @@
+---
+name: GoogleNet on ImageNet
+SINGA version: 1.0.1
+SINGA commit: 8c990f7da2de220e8a012c6a8ecc897dc7532744
+parameter_url: https://s3-ap-southeast-1.amazonaws.com/dlfile/bvlc_googlenet.tar.gz
+parameter_sha1: 0a88e8948b1abca3badfd8d090d6be03f8d7655d
+license: unrestricted https://github.com/BVLC/caffe/tree/master/models/bvlc_googlenet
+---
+
+# Image Classification using GoogleNet
+
+
+In this example, we convert GoogleNet trained on Caffe to SINGA for image classification.
+
+## Instructions
+
+* Download the parameter checkpoint file into this folder
+
+ $ wget https://s3-ap-southeast-1.amazonaws.com/dlfile/bvlc_googlenet.tar.gz
+ $ tar xvf bvlc_googlenet.tar.gz
+
+* Run the program
+
+ # use cpu
+ $ python serve.py -C &
+ # use gpu
+ $ python serve.py &
+
+* Submit images for classification
+
+ $ curl -i -F image=@image1.jpg http://localhost:9999/api
+ $ curl -i -F image=@image2.jpg http://localhost:9999/api
+ $ curl -i -F image=@image3.jpg http://localhost:9999/api
+
+image1.jpg, image2.jpg and image3.jpg should be downloaded before executing the above commands.
+
+## Details
+
+We first extract the parameter values from [Caffe's checkpoint file](http://dl.caffe.berkeleyvision.org/bvlc_googlenet.caffemodel) into a pickle file.
+After downloading the checkpoint file into the `caffe_root/python` folder, run the following script,
+
+ # to be executed within caffe_root/python folder
+ import caffe
+ import numpy as np
+ import cPickle as pickle
+
+ model_def = '../models/bvlc_googlenet/deploy.prototxt'
+ weight = 'bvlc_googlenet.caffemodel' # must be downloaded at first
+ net = caffe.Net(model_def, weight, caffe.TEST)
+
+ params = {}
+    for layer_name in net.params.keys():
+        weights = np.copy(net.params[layer_name][0].data)
+        bias = np.copy(net.params[layer_name][1].data)
+        params[layer_name + '_weight'] = weights
+        params[layer_name + '_bias'] = bias
+        print layer_name, weights.shape, bias.shape
+
+ with open('bvlc_googlenet.pickle', 'wb') as fd:
+ pickle.dump(params, fd)
+
+Then we construct the GoogleNet using SINGA's FeedForwardNet structure.
+Note that we added an EndPadding layer to resolve the discrepancy between the
+rounding strategies of the pooling layer in Caffe (ceil) and cuDNN (floor).
+Only the MaxPooling layers outside the inception blocks have this problem.
+Refer to [this blog post](http://joelouismarino.github.io/blog_posts/blog_googlenet_keras.html) for more details.
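The rounding mismatch is easy to verify with a little arithmetic (an illustrative sketch; `pool_out_size` is not part of SINGA or Caffe):

```python
import math

def pool_out_size(in_size, kernel, stride, mode):
    """Output length of a pooling layer along one spatial dimension."""
    frac = (in_size - kernel) / stride
    n = math.ceil(frac) if mode == "ceil" else math.floor(frac)
    return n + 1

# A 112-unit input with a 3x3 max pool of stride 2:
# Caffe rounds up (ceil) and cuDNN rounds down (floor),
# so their output sizes differ by one.
print(pool_out_size(112, 3, 2, "ceil"))   # 56
print(pool_out_size(112, 3, 2, "floor"))  # 55
```

Padding the input before pooling, which is what the EndPadding layer does, lets the floor-based computation reproduce the ceil-based output size.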
Added: incubator/singa/site/trunk/en/_sources/docs/model_zoo/imagenet/resnet/README.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/model_zoo/imagenet/resnet/README.md.txt?rev=1784386&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/model_zoo/imagenet/resnet/README.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/model_zoo/imagenet/resnet/README.md.txt Sat Feb 25 15:30:33 2017
@@ -0,0 +1,54 @@
+---
+name: Resnets on ImageNet
+SINGA version: 1.1
+SINGA commit: 45ec92d8ffc1fa1385a9307fdf07e21da939ee2f
+parameter_url: https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/resnet-18.tar.gz
+license: Apache V2, https://github.com/facebook/fb.resnet.torch/blob/master/LICENSE
+---
+
+# Image Classification using Residual Networks
+
+
+In this example, we convert Residual Networks trained on [Torch](https://github.com/facebook/fb.resnet.torch) to SINGA for image classification.
+
+## Instructions
+
+* Download one parameter checkpoint file (see below) and the synset word file of ImageNet into this folder, e.g.,
+
+ $ wget https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/resnet-18.tar.gz
+ $ wget https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/synset_words.txt
+ $ tar xvf resnet-18.tar.gz
+
+* Usage
+
+ $ python serve.py -h
+
+* Example
+
+ # use cpu
+ $ python serve.py --use_cpu --parameter_file resnet-18.pickle --model resnet --depth 18 &
+ # use gpu
+ $ python serve.py --parameter_file resnet-18.pickle --model resnet --depth 18 &
+
+ The parameter files for the following model and depth configuration pairs are provided:
+    * resnet (original resnet), [18](https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/resnet-18.tar.gz)|[34](https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/resnet-34.tar.gz)|[101](https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/resnet-101.tar.gz)|[152](https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/resnet-152.tar.gz)
+ * addbn (resnet with a batch normalization layer after the addition), [50](https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/resnet-50.tar.gz)
+ * wrn (wide resnet), [50](https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/wrn-50-2.tar.gz)
+    * preact (resnet with pre-activation), [200](https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/resnet-200.tar.gz)
+
+* Submit images for classification
+
+ $ curl -i -F image=@image1.jpg http://localhost:9999/api
+ $ curl -i -F image=@image2.jpg http://localhost:9999/api
+ $ curl -i -F image=@image3.jpg http://localhost:9999/api
+
+image1.jpg, image2.jpg and image3.jpg should be downloaded before executing the above commands.
+
+## Details
+
+The parameter files were extracted from the original [Torch files](https://github.com/facebook/fb.resnet.torch/tree/master/pretrained) via
+the `convert.py` program.
+
+Usage:
+
+ $ python convert.py -h
Added: incubator/singa/site/trunk/en/_sources/docs/model_zoo/index.rst.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/model_zoo/index.rst.txt?rev=1784386&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/model_zoo/index.rst.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/model_zoo/index.rst.txt Sat Feb 25 15:30:33 2017
@@ -0,0 +1,29 @@
+..
+.. Licensed to the Apache Software Foundation (ASF) under one
+.. or more contributor license agreements. See the NOTICE file
+.. distributed with this work for additional information
+.. regarding copyright ownership. The ASF licenses this file
+.. to you under the Apache License, Version 2.0 (the
+.. "License"); you may not use this file except in compliance
+.. with the License. You may obtain a copy of the License at
+..
+.. http://www.apache.org/licenses/LICENSE-2.0
+..
+.. Unless required by applicable law or agreed to in writing, software
+.. distributed under the License is distributed on an "AS IS" BASIS,
+.. WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+.. See the License for the specific language governing permissions and
+.. limitations under the License.
+..
+
+Model Zoo
+=========
+
+.. toctree::
+
+ cifar10/README
+ char-rnn/README
+ imagenet/alexnet/README
+ imagenet/googlenet/README
+
+
Added: incubator/singa/site/trunk/en/_sources/docs/model_zoo/mnist/README.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/model_zoo/mnist/README.md.txt?rev=1784386&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/model_zoo/mnist/README.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/model_zoo/mnist/README.md.txt Sat Feb 25 15:30:33 2017
@@ -0,0 +1,18 @@
+# Train an RBM model on the MNIST dataset
+
+This example trains an RBM model on the
+MNIST dataset. The RBM model and its hyper-parameters are set following
+[Hinton's paper](http://www.cs.toronto.edu/~hinton/science.pdf).
+
+## Running instructions
+
+1. Download the pre-processed [MNIST dataset](https://github.com/mnielsen/neural-networks-and-deep-learning/raw/master/data/mnist.pkl.gz)
+
+2. Start the training
+
+ python train.py mnist.pkl.gz
+
+By default the training code runs on the CPU. To run it on a GPU card, start
+the program with an additional argument
+
+ python train.py mnist.pkl.gz --use_gpu
Added: incubator/singa/site/trunk/en/_sources/docs/net.rst.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/net.rst.txt?rev=1784386&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/net.rst.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/net.rst.txt Sat Feb 25 15:30:33 2017
@@ -0,0 +1,26 @@
+.. Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+
+
+FeedForward Net
+===============
+
+.. automodule:: singa.net
+ :members:
+ :member-order: bysource
+ :show-inheritance:
+ :undoc-members:
Added: incubator/singa/site/trunk/en/_sources/docs/neural-net.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/neural-net.md.txt?rev=1784386&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/neural-net.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/neural-net.md.txt Sat Feb 25 15:30:33 2017
@@ -0,0 +1,326 @@
+# Neural Net
+
+
+`NeuralNet` in SINGA represents an instance of a user's neural net model. As a
+neural net typically consists of a set of layers, `NeuralNet` comprises
+a set of unidirectionally connected [Layer](layer.html)s.
+This page describes how to convert a user's neural net into
+the configuration of `NeuralNet`.
+
+<img src="../_static/images/model-category.png" align="center" width="200px"/>
+<span><strong>Figure 1 - Categorization of popular deep learning models.</strong></span>
+
+## Net structure configuration
+
+Users configure the `NeuralNet` by listing all layers of the neural net and
+specifying each layer's source layer names. Popular deep learning models can be
+categorized as shown in Figure 1. The subsequent sections give details for each
+category.
+
+### Feed-forward models
+
+<div align = "left">
+<img src="../_static/images/mlp-net.png" align="center" width="200px"/>
+<span><strong>Figure 2 - Net structure of an MLP model.</strong></span>
+</div>
+
+Feed-forward models, e.g., CNN and MLP, are easy to configure as their layer
+connections are directed and acyclic. The
+configuration for the MLP model shown in Figure 2 is as follows,
+
+ net {
+ layer {
+        name : 'data'
+ type : kData
+ }
+ layer {
+        name : 'image'
+ type : kImage
+ srclayer: 'data'
+ }
+ layer {
+        name : 'label'
+ type : kLabel
+ srclayer: 'data'
+ }
+ layer {
+        name : 'hidden'
+ type : kHidden
+ srclayer: 'image'
+ }
+ layer {
+        name : 'softmax'
+ type : kSoftmaxLoss
+ srclayer: 'hidden'
+ srclayer: 'label'
+ }
+ }
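Since feed-forward connections must form a directed acyclic graph, a configuration like the one above can be sanity-checked with a depth-first search over the source-layer links (a toy sketch, not SINGA's actual parser):

```python
# Hypothetical in-memory form of the MLP configuration above:
# each layer name maps to its list of source layers.
net = {
    "data":    [],
    "image":   ["data"],
    "label":   ["data"],
    "hidden":  ["image"],
    "softmax": ["hidden", "label"],
}

def is_acyclic(graph):
    """DFS over srclayer links; revisiting a 'visiting' node means a cycle."""
    state = {}

    def visit(node):
        if state.get(node) == "visiting":
            return False          # back edge: cycle detected
        if state.get(node) == "done":
            return True
        state[node] = "visiting"
        ok = all(visit(src) for src in graph[node])
        state[node] = "done"
        return ok

    return all(visit(n) for n in graph)

print(is_acyclic(net))  # True: a valid feed-forward net
```

An energy model's pair of mutual connections (e.g., the RBM configuration later on this page) would fail this check, which is why such models fall into category B rather than the feed-forward category.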
+
+### Energy models
+
+<img src="../_static/images/rbm-rnn.png" align="center" width="500px"/>
+<span><strong>Figure 3 - Convert connections in RBM and RNN.</strong></span>
+
+
+For energy models, including RBM, DBM,
+etc., the connections are undirected (i.e., Category B). To represent these models using
+`NeuralNet`, users can simply replace each connection with two directed
+connections, as shown in Figure 3a. In other words, for each pair of connected layers, their source
+layer fields should include each other's name.
+The full [RBM example](rbm.html) has a
+detailed neural net configuration for an RBM model, which looks like
+
+ net {
+ layer {
+ name : "vis"
+ type : kVisLayer
+ param {
+ name : "w1"
+ }
+ srclayer: "hid"
+ }
+ layer {
+ name : "hid"
+ type : kHidLayer
+ param {
+ name : "w2"
+ share_from: "w1"
+ }
+ srclayer: "vis"
+ }
+ }
+
+### RNN models
+
+For recurrent neural networks (RNN), users can remove the recurrent connections
+by unrolling the recurrent layer. For example, in Figure 3b, the original
+layer is unrolled into a new layer with 4 internal layers. In this way, the
+model is like a normal feed-forward model and thus can be configured similarly.
+The [RNN example](rnn.html) has a full neural net
+configuration for an RNN model.
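Unrolling can be sketched in a few lines of NumPy (hypothetical shapes; not SINGA's RNN implementation): the recurrent layer becomes 4 internal steps that share one set of weights, so the computation is a plain feed-forward chain.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 4))   # input-to-hidden weights, shared by all steps
U = rng.standard_normal((8, 8))   # hidden-to-hidden weights, shared by all steps
xs = rng.standard_normal((4, 4))  # 4 time steps, each a 4-dim input vector

h = np.zeros(8)
states = []
for x in xs:                      # each iteration is one unrolled internal layer
    h = np.tanh(W @ x + U @ h)    # the same W and U are reused at every step
    states.append(h)

# 4 hidden states, one per unrolled internal layer
print(len(states), states[0].shape)
```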
+
+
+## Configuration for multiple nets
+
+Typically, a training job includes three neural nets, for the
+training, validation and test phases respectively. The three neural nets share most
+layers except the data layer, loss layer, output layer, etc. To avoid
+redundant configuration of the shared layers, users can use the `exclude`
+field to filter a layer out of a neural net, e.g., the following layer will be
+filtered out when creating the test `NeuralNet`.
+
+
+ layer {
+ ...
+ exclude : kTest # filter this layer for creating test net
+ }
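The effect of the `exclude` field amounts to dropping every layer whose excluded phase matches the net being created (a minimal sketch; SINGA's real configuration objects are protobuf messages, not dicts):

```python
# Hypothetical flattened view of a net configuration: shared layers have
# no "exclude" entry, phase-specific layers name the phase to skip them in.
layer_confs = [
    {"name": "train_data", "exclude": "kTest"},
    {"name": "test_data", "exclude": "kTrain"},
    {"name": "hidden"},
    {"name": "softmax"},
]

def layers_for_phase(confs, phase):
    """Keep every layer that is not excluded for the given phase."""
    return [c["name"] for c in confs if c.get("exclude") != phase]

print(layers_for_phase(layer_confs, "kTest"))
# ['test_data', 'hidden', 'softmax']
```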
+
+
+
+## Neural net partitioning
+
+A neural net can be partitioned in different ways to distribute the training
+over multiple workers.
+
+### Batch and feature dimension
+
+<img src="../_static/images/partition_fc.png" align="center" width="400px"/>
+<span><strong>Figure 4 - Partitioning of a fully connected layer.</strong></span>
+
+
+Every layer's feature blob is considered a matrix whose rows are feature
+vectors. Thus, one layer can be split along two dimensions. Partitioning on
+dimension 0 (also called the batch dimension) slices the feature matrix by rows.
+For instance, if the mini-batch size is 256 and the layer is partitioned into 2
+sub-layers, each sub-layer would have 128 feature vectors in its feature blob.
+Partitioning on this dimension has no effect on the parameters, as every
+[Param](param.html) object is replicated in the sub-layers. Partitioning on dimension
+1 (also called the feature dimension) slices the feature matrix by columns. For
+example, suppose the original feature vector has 50 units; after partitioning
+into 2 sub-layers, each sub-layer would have 25 units. This partitioning may
+result in [Param](param.html) objects being split, as shown in
+Figure 4. Both the bias vector and the weight matrix are
+partitioned into two sub-layers.
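The two schemes can be illustrated with NumPy, using the shapes from the text (an illustration only, not SINGA code):

```python
import numpy as np

feats = np.zeros((256, 50))       # 256 feature vectors of 50 units each

# Dimension 0 (batch): split the rows; parameters are replicated, not split.
batch_parts = np.split(feats, 2, axis=0)      # two (128, 50) sub-blobs

# Dimension 1 (feature): split the columns; the weight matrix and bias
# of a fully connected layer must be split along with them.
feat_parts = np.split(feats, 2, axis=1)       # two (256, 25) sub-blobs
W = np.zeros((50, 50))                        # weights mapping 50 -> 50 units
b = np.zeros(50)
W_parts = np.split(W, 2, axis=1)              # each sub-layer owns 25 output units
b_parts = np.split(b, 2)

print(batch_parts[0].shape, feat_parts[0].shape, W_parts[0].shape)
```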
+
+
+### Partitioning configuration
+
+There are 4 partitioning schemes, whose configurations are given below,
+
+ 1. Partitioning each single layer into sub-layers on the batch dimension (see
+ below). It is enabled by configuring the partition dimension of the layer to
+ 0, e.g.,
+
+ # with other fields omitted
+ layer {
+ partition_dim: 0
+ }
+
+ 2. Partitioning each single layer into sub-layers on the feature dimension (see
+ below). It is enabled by configuring the partition dimension of the layer to
+ 1, e.g.,
+
+ # with other fields omitted
+ layer {
+ partition_dim: 1
+ }
+
+ 3. Partitioning all layers into different subsets. It is enabled by
+ configuring the location ID of a layer, e.g.,
+
+ # with other fields omitted
+ layer {
+ location: 1
+ }
+ layer {
+ location: 0
+ }
+
+
+ 4. Hybrid partitioning of strategy 1, 2 and 3. The hybrid partitioning is
+ useful for large models. An example application is to implement the
+ [idea proposed by Alex](http://arxiv.org/abs/1404.5997).
+ Hybrid partitioning is configured like,
+
+ # with other fields omitted
+ layer {
+ location: 1
+ }
+ layer {
+ location: 0
+ }
+ layer {
+ partition_dim: 0
+ location: 0
+ }
+ layer {
+ partition_dim: 1
+ location: 0
+ }
+
+Currently SINGA supports strategy-2 well. The other partitioning strategies
+are under test and will be released in a later version.
+
+## Parameter sharing
+
+Parameters can be shared in two cases,
+
+ * sharing parameters among layers via user configuration. For example, the
+ visible layer and hidden layer of an RBM share the weight matrix, which is configured through
+ the `share_from` field as shown in the above RBM configuration. The
+ configurations must be the same (except the name) for shared parameters.
+
+ * due to neural net partitioning, some `Param` objects are replicated into
+ different workers, e.g., partitioning one layer on batch dimension. These
+ workers share parameter values. SINGA controls this kind of parameter
+ sharing automatically, users do not need to do any configuration.
+
+ * the `NeuralNet` instances for training and testing (and validation) share most
+ layers, and thus share `Param` values.
+
+If the shared `Param` instances reside in the same process (possibly in different
+threads), they use the same chunk of memory for their values. But they
+have separate memory for their gradients. In fact, their
+gradients will be averaged by the stub or server.
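The sharing scheme can be mimicked with NumPy arrays (a toy model of the behavior, not SINGA's implementation): both workers point at one value buffer but keep private gradient buffers, and an averaged gradient updates the shared values.

```python
import numpy as np

value = np.zeros(10)              # one chunk of memory for the shared values
workers = [
    {"value": value, "grad": np.full(10, g)}  # same value array, private grads
    for g in (1.0, 3.0)
]

# the two Params really share storage for their values
assert workers[0]["value"] is workers[1]["value"]

# the stub/server averages the gradients, then updates the shared values
avg_grad = sum(w["grad"] for w in workers) / len(workers)
value -= 0.1 * avg_grad           # every worker sees the update immediately

print(avg_grad[0], workers[1]["value"][0])
```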
+
+## Advanced user guide
+
+### Creation
+
+ static NeuralNet* NeuralNet::Create(const NetProto& np, Phase phase, int num);
+
+The above function creates a `NeuralNet` for a given phase, and returns a
+pointer to the `NeuralNet` instance. The phase is in {kTrain,
+kValidation, kTest}. `num` is used for net partitioning which indicates the
+number of partitions. Typically, a training job includes three neural nets, for the
+training, validation and test phases respectively. The three neural nets share most
+layers except the data layer, loss layer, output layer, etc. The `Create`
+function takes in the full net configuration including layers for training,
+validation and test. It removes layers for phases other than the specified
+phase based on the `exclude` field in
+[layer configuration](layer.html):
+
+ layer {
+ ...
+ exclude : kTest # filter this layer for creating test net
+ }
+
+The filtered net configuration is passed to the constructor of `NeuralNet`:
+
+ NeuralNet::NeuralNet(NetProto netproto, int npartitions);
+
+The constructor first creates a graph representing the net structure in
+
+ Graph* NeuralNet::CreateGraph(const NetProto& netproto, int npartitions);
+
+Next, it creates a layer for each node and connects layers if their nodes are
+connected.
+
+ void NeuralNet::CreateNetFromGraph(Graph* graph, int npartitions);
+
+Since the `NeuralNet` instance may be shared among multiple workers, the
+`Create` function returns a pointer to the `NeuralNet` instance.
+
+### Parameter sharing
+
+`Param` sharing
+is enabled by first sharing the Param configuration (in `NeuralNet::Create`)
+to create two similar (e.g., the same shape) Param objects, and then calling
+(in `NeuralNet::CreateNetFromGraph`),
+
+ void Param::ShareFrom(const Param& from);
+
+It is also possible to share `Param`s of two nets, e.g., sharing parameters of
+the training net and the test net,
+
+    void NeuralNet::ShareParamsFrom(NeuralNet* other);
+
+It will call `Param::ShareFrom` for each Param object.
+
+### Access functions
+`NeuralNet` provides several access functions to get the layers and params
+of the net:
+
+    const std::vector<Layer*>& layers() const;
+    const std::vector<Param*>& params() const;
+ Layer* name2layer(string name) const;
+ Param* paramid2param(int id) const;
+
+
+### Partitioning
+
+
+#### Implementation
+
+SINGA partitions the neural net in the `CreateGraph` function, which creates one
+node for each (partitioned) layer. For example, if one layer's partition
+dimension is 0 or 1, then it creates `npartitions` nodes for it; if the
+partition dimension is -1, a single node is created, i.e., no partitioning.
+Each node is assigned a partition (or location) ID. If the original layer is
+configured with a location ID, then that ID is assigned to each newly created node.
+These nodes are connected according to the connections of the original layers.
+Some connection layers will be added automatically.
+For instance, if two connected sub-layers are located at two
+different workers, then a pair of bridge layers is inserted to transfer the
+feature (and gradient) blob between them. When two layers are partitioned on
+different dimensions, a concatenation layer which concatenates feature rows (or
+columns) and a slice layer which slices feature rows (or columns) would be
+inserted. These connection layers help make the network communication and
+synchronization transparent to the users.
+
+#### Dispatching partitions to workers
+
+Each (partitioned) layer is assigned a location ID, based on which it is dispatched to one
+worker. Particularly, the pointer to the `NeuralNet` instance is passed
+to every worker within the same group, but each worker only computes over the
+layers that have the same partition (or location) ID as the worker's ID. When
+every worker computes the gradients of the entire model parameters
+(strategy-2), we refer to this process as data parallelism. When different
+workers compute the gradients of different parameters (strategy-3 or
+strategy-1), we call this process model parallelism. The hybrid partitioning
+leads to hybrid parallelism where some workers compute the gradients of the
+same subset of model parameters while other workers compute on different model
+parameters. For example, to implement the hybrid parallelism for the
+[DCNN model](http://arxiv.org/abs/1404.5997), we set `partition_dim = 0` for
+lower layers and `partition_dim = 1` for higher layers.
+
Added: incubator/singa/site/trunk/en/_sources/docs/notebook/README.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/notebook/README.md.txt?rev=1784386&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/notebook/README.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/notebook/README.md.txt Sat Feb 25 15:30:33 2017
@@ -0,0 +1,3 @@
+These are some examples in IPython notebooks.
+
+You can open them in [notebook viewer](http://nbviewer.jupyter.org/github/apache/incubator-singa/blob/master/doc/en/docs/notebook/index.ipynb).
Added: incubator/singa/site/trunk/en/_sources/docs/optimizer.rst.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/optimizer.rst.txt?rev=1784386&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/optimizer.rst.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/optimizer.rst.txt Sat Feb 25 15:30:33 2017
@@ -0,0 +1,29 @@
+.. Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+
+
+Optimizer
+=========
+
+
+.. automodule:: singa.optimizer
+ :members:
+ :member-order: bysource
+ :show-inheritance:
+ :undoc-members:
+
+
Added: incubator/singa/site/trunk/en/_sources/docs/snapshot.rst.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/snapshot.rst.txt?rev=1784386&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/snapshot.rst.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/snapshot.rst.txt Sat Feb 25 15:30:33 2017
@@ -0,0 +1,24 @@
+.. Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+
+
+Snapshot
+========
+
+
+.. automodule:: singa.snapshot
+ :members:
Added: incubator/singa/site/trunk/en/_sources/docs/software_stack.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/software_stack.md.txt?rev=1784386&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/software_stack.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/software_stack.md.txt Sat Feb 25 15:30:33 2017
@@ -0,0 +1,99 @@
+# Software Stack
+
+SINGA's software stack includes three major components, namely, core, IO and
+model. Figure 1 illustrates these components together with the hardware.
+The core component provides memory management and tensor operations;
+IO has classes for reading (and writing) data from (to) disk and network; the
+model component provides data structures and algorithms for machine learning models,
+e.g., layers for neural network models, and optimizers/initializers/metrics/losses for
+general machine learning models.
+
+
+<img src="../_static/images/singav1-sw.png" align="center" width="500px"/>
+<br/>
+<span><strong>Figure 1 - SINGA V1 software stack.</strong></span>
+
+## Core
+
+[Tensor](tensor.html) and [Device](device.html) are two core abstractions in SINGA. The Tensor class represents a
+multi-dimensional array, which stores model variables and provides linear algebra
+operations for machine learning
+algorithms, including matrix multiplication and random functions. Each tensor
+instance (i.e. a tensor) is allocated on a Device instance.
+Each Device instance (i.e. a device) is created against one hardware device,
+e.g. a GPU card or a CPU core. Devices manage the memory of tensors and execute
+tensor operations on their execution units, e.g. CPU threads or CUDA streams.
+
+Depending on the hardware and the programming language, SINGA has implemented
+the following specific device classes:
+
+* **CudaGPU** represents an Nvidia GPU card. The execution units are the CUDA streams.
+* **CppCPU** represents a normal CPU. The execution units are the CPU threads.
+* **OpenclGPU** represents a normal GPU card from either Nvidia or AMD.
+ The execution units are the CommandQueues. Given that OpenCL is compatible with
+ many hardware devices, e.g. FPGA and ARM, the OpenclGPU has the potential to be
+ extended for other devices.
+
+Different types of devices use different programming languages to write the kernel
+functions for tensor operations,
+
+* CppMath (tensor_math_cpp.h) implements the tensor operations using Cpp for CppCPU
+* CudaMath (tensor_math_cuda.h) implements the tensor operations using CUDA for CudaGPU
+* OpenclMath (tensor_math_opencl.h) implements the tensor operations using OpenCL for OpenclGPU
+
+In addition, different types of data, such as float32 and float16, could be supported by adding
+the corresponding tensor functions.
+
+Typically, users create a device instance and pass it to create multiple
+tensor instances. When users call the Tensor functions, these functions invoke
+the corresponding implementation (CppMath/CudaMath/OpenclMath) automatically. In
+other words, the implementation of Tensor operations is transparent to users.
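The dispatch idea can be sketched as a toy pattern (not SINGA's actual classes): a tensor records its device, and each operation looks up that device's math backend.

```python
# Toy dispatch table: the op implementation is chosen by the tensor's device.
BACKENDS = {
    "cpp": {"add": lambda a, b: [x + y for x, y in zip(a, b)]},
    "cuda": {"add": lambda a, b: [x + y for x, y in zip(a, b)]},  # stand-in for a kernel launch
}

class Tensor:
    def __init__(self, data, device="cpp"):
        self.data, self.device = data, device

    def __add__(self, other):
        # users just write "+"; the right backend (CppMath/CudaMath/...)
        # is selected here from the tensor's device
        impl = BACKENDS[self.device]["add"]
        return Tensor(impl(self.data, other.data), self.device)

t = Tensor([1, 2]) + Tensor([3, 4])
print(t.data)  # [4, 6]
```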
+
+Most machine learning algorithms could be expressed using (dense or sparse) tensors.
+Therefore, with the Tensor abstraction, SINGA would be able to run a wide range of models,
+including deep learning models and other traditional machine learning models.
+
+The Tensor and Device abstractions are extensible to support a wide range of hardware devices
+using different programming languages. A new hardware device would be supported by
+adding a new Device subclass and the corresponding implementation of the Tensor
+operations (xxxMath).
+
+Optimizations in terms of speed and memory could be implemented by Device, which
+manages both operation execution and memory malloc/free. More optimization details
+would be described in the [Device page](device.html).
+
+
+## Model
+
+On top of the Tensor and Device abstractions, SINGA provides some higher level
+classes for machine learning modules.
+
+* [Layer](layer.html) and its subclasses are specific to neural networks. Every layer provides
+  functions for forward propagating features and backward propagating gradients w.r.t the training loss functions.
+  They wrap the complex layer operations so that users can easily create neural nets
+  by connecting a set of layers.
+
+* [Initializer](initializer.html) and its subclasses provide various methods of initializing
+  model parameters (stored in Tensor instances), following Uniform, Gaussian, etc.
+
+* [Loss](loss.html) and its subclasses define the training objective loss functions.
+ Both functions of computing the loss values and computing the gradient of the prediction w.r.t the
+ objective loss are implemented. Example loss functions include squared error and cross entropy.
+
+* [Metric](metric.html) and its subclasses provide the function to measure the
+ performance of the model, e.g., the accuracy.
+
+* [Optimizer](optimizer.html) and its subclasses implement the methods for updating
+ model parameter values using parameter gradients, including SGD, AdaGrad, RMSProp etc.
+
+
+## IO
+
+The IO module consists of classes for data loading, data preprocessing and message passing.
+
+* Reader and its subclasses load string records from disk files
+* Writer and its subclasses write string records to disk files
+* Encoder and its subclasses encode Tensor instances into string records
+* Decoder and its subclasses decode string records into Tensor instances
+* Endpoint represents a communication endpoint which provides functions for sending messages to and receiving messages from other endpoints.
+* Message represents communication messages between Endpoint instances. It carries both meta data and payload.
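A minimal Encoder/Decoder round trip might look like the following (a NumPy-based sketch; SINGA's real classes define their own record format):

```python
import struct
import numpy as np

def encode(tensor):
    """Pack the shape and raw float32 data into one string record."""
    header = struct.pack("I" * (1 + tensor.ndim), tensor.ndim, *tensor.shape)
    return header + tensor.astype(np.float32).tobytes()

def decode(record):
    """Recover the tensor from a string record produced by encode()."""
    (ndim,) = struct.unpack_from("I", record)
    shape = struct.unpack_from("I" * ndim, record, 4)
    data = np.frombuffer(record, dtype=np.float32, offset=4 * (1 + ndim))
    return data.reshape(shape)

t = np.arange(6, dtype=np.float32).reshape(2, 3)
assert (decode(encode(t)) == t).all()   # lossless round trip
```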
Added: incubator/singa/site/trunk/en/_sources/docs/tensor.rst.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/tensor.rst.txt?rev=1784386&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/tensor.rst.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/tensor.rst.txt Sat Feb 25 15:30:33 2017
@@ -0,0 +1,48 @@
+.. Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+
+
+Tensor
+========
+
+Each Tensor instance is a multi-dimensional array allocated on a specific
+Device instance. Tensor instances store variables and provide
+linear algebra operations over different types of hardware devices
+transparently. Note that users must ensure that tensor operands are
+allocated on the same device, except for copy functions.
+
+
+Tensor implementation
+---------------------
+
+SINGA has three different sets of implementations of Tensor functions, one for each
+type of Device.
+
+* 'tensor_math_cpp.h' implements operations using Cpp (with CBLAS) for CppCPU devices.
+* 'tensor_math_cuda.h' implements operations using Cuda (with cuBLAS) for CudaGPU devices.
+* 'tensor_math_opencl.h' implements operations using OpenCL for OpenclGPU devices.
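The per-device implementation scheme above amounts to dispatching each operation to the implementation registered for the operands' device. A minimal Python sketch of that dispatch pattern follows; the names are illustrative, not SINGA code:

```python
# Each device type registers its own implementation of an operation; the Tensor
# front-end picks the implementation matching the operands' device. Here all
# three "backends" are plain Python, standing in for the real per-device files.
ADD_IMPL = {
    "cpp":    lambda a, b: [x + y for x, y in zip(a, b)],  # cf. tensor_math_cpp.h
    "cuda":   lambda a, b: [x + y for x, y in zip(a, b)],  # cf. tensor_math_cuda.h
    "opencl": lambda a, b: [x + y for x, y in zip(a, b)],  # cf. tensor_math_opencl.h
}

class Tensor:
    def __init__(self, data, device="cpp"):
        self.data, self.device = data, device
    def __add__(self, other):
        # Operands must live on the same device, as noted above.
        if self.device != other.device:
            raise ValueError("tensor operands must be on the same device")
        return Tensor(ADD_IMPL[self.device](self.data, other.data), self.device)

a = Tensor([1.0, 2.0], device="cpp")
b = Tensor([3.0, 4.0], device="cpp")
print((a + b).data)  # → [4.0, 6.0]
```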
+
+Python API
+----------
+
+
+.. automodule:: singa.tensor
+ :members:
+
+
+CPP API
+---------
Added: incubator/singa/site/trunk/en/_sources/docs/utils.rst.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/utils.rst.txt?rev=1784386&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/utils.rst.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/utils.rst.txt Sat Feb 25 15:30:33 2017
@@ -0,0 +1,24 @@
+.. Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+
+
+Utils
+=========
+
+
+.. automodule:: singa.utils
+ :members:
Added: incubator/singa/site/trunk/en/_sources/downloads.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/downloads.md.txt?rev=1784386&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/downloads.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/downloads.md.txt Sat Feb 25 15:30:33 2017
@@ -0,0 +1,108 @@
+## Download SINGA
+
+* Latest code: please clone the dev branch from [Github](https://github.com/apache/incubator-singa)
+
+* v1.1.0 (12 February 2017):
+ * [Apache SINGA 1.1.0](http://www.apache.org/dyn/closer.cgi/incubator/singa/1.1.0/apache-singa-incubating-1.1.0.tar.gz)
+ [\[MD5\]](https://dist.apache.org/repos/dist/release/incubator/singa/1.1.0/apache-singa-incubating-1.1.0.tar.gz.md5)
+ [\[KEYS\]](https://dist.apache.org/repos/dist/release/incubator/singa/1.1.0/KEYS)
+ * [Release Notes 1.1.0](releases/RELEASE_NOTES_1.1.0.html)
+ * New features and major updates,
+ * Create Docker images (CPU and GPU versions)
+ * Create Amazon AMI for SINGA (CPU version)
+ * Integrate with Jenkins for automatically generating Wheel and Debian packages (for installation), and updating the website.
+ * Enhance the FeedForwardNet, e.g., multiple inputs and verbose mode for debugging
+ * Add Concat and Slice layers
+ * Extend CrossEntropyLoss to accept instance with multiple labels
+ * Add image_tool.py with image augmentation methods
+ * Support model loading and saving via the Snapshot API
+ * Compile SINGA source on Windows
+ * Compile mandatory dependent libraries together with SINGA code
+ * Enable Java binding (basic) for SINGA
+ * Add version ID in checkpointing files
+ * Add Rafiki toolkit for providing RESTful APIs
+ * Add examples of models pretrained in Caffe, including GoogleNet
+
+
+
+* v1.0.0 (8 September 2016):
+ * [Apache SINGA 1.0.0](https://archive.apache.org/dist/incubator/singa/1.0.0/apache-singa-incubating-1.0.0.tar.gz)
+ [\[MD5\]](https://archive.apache.org/dist/incubator/singa/1.0.0/apache-singa-incubating-1.0.0.tar.gz.md5)
+ [\[KEYS\]](https://archive.apache.org/dist/incubator/singa//1.0.0/KEYS)
+ * [Release Notes 1.0.0](releases/RELEASE_NOTES_1.0.0.html)
+ * New features and major updates,
+ * Tensor abstraction for supporting more machine learning models.
+ * Device abstraction for running on different hardware devices, including CPU, (Nvidia/AMD) GPU and FPGA (to be tested in later versions).
+ * Replace GNU autotool with cmake for compilation.
+ * Support Mac OS
+ * Improve Python binding, including installation and programming
+ * More deep learning models, including VGG and ResNet
+ * More IO classes for reading/writing files and encoding/decoding data
+ * New network communication components directly based on Socket.
+ * Cudnn V5 with Dropout and RNN layers.
+ * Change the website building tool from Maven to Sphinx
+ * Integrate Travis-CI
+
+
+* v0.3.0 (20 April 2016):
+ * [Apache SINGA 0.3.0](https://archive.apache.org/dist/incubator/singa/0.3.0/apache-singa-incubating-0.3.0.tar.gz)
+ [\[MD5\]](https://archive.apache.org/dist/incubator/singa/0.3.0/apache-singa-incubating-0.3.0.tar.gz.md5)
+ [\[KEYS\]](https://archive.apache.org/dist/incubator/singa/0.3.0/KEYS)
+ * [Release Notes 0.3.0](releases/RELEASE_NOTES_0.3.0.html)
+ * New features and major updates,
+ * [Training on GPU cluster](v0.3.0/gpu.html) enables training of deep learning models over a GPU cluster.
+ * [Python wrapper improvement](v0.3.0/python.html) makes it easy to configure the job, including neural net and SGD algorithm.
+ * [New SGD updaters](v0.3.0/updater.html) are added, including Adam, AdaDelta and AdaMax.
+ * [Installation](v0.3.0/installation.html) has fewer dependent libraries for single node training.
+ * Heterogeneous training with CPU and GPU.
+ * Support cuDNN V4.
+ * Data prefetching.
+ * Fix some bugs.
+
+
+
+* v0.2.0 (14 January 2016):
+ * [Apache SINGA 0.2.0](https://archive.apache.org/dist/incubator/singa/0.2.0/apache-singa-incubating-0.2.0.tar.gz)
+ [\[MD5\]](https://archive.apache.org/dist/incubator/singa/0.2.0/apache-singa-incubating-0.2.0.tar.gz.md5)
+ [\[KEYS\]](https://archive.apache.org/dist/incubator/singa/0.2.0/KEYS)
+ * [Release Notes 0.2.0](releases/RELEASE_NOTES_0.2.0.html)
+ * New features and major updates,
+ * [Training on GPU](v0.2.0/gpu.html) enables training of complex models on a single node with multiple GPU cards.
+ * [Hybrid neural net partitioning](v0.2.0/hybrid.html) supports data and model parallelism at the same time.
+ * [Python wrapper](v0.2.0/python.html) makes it easy to configure the job, including neural net and SGD algorithm.
+ * [RNN model and BPTT algorithm](v0.2.0/general-rnn.html) are implemented to support applications based on RNN models, e.g., GRU.
+ * [Cloud software integration](v0.2.0/distributed-training.html) includes Mesos, Docker and HDFS.
+ * Visualization of neural net structure and layer information, which is helpful for debugging.
+ * Linear algebra functions and random functions against Blobs and raw data pointers.
+ * New layers, including SoftmaxLayer, ArgSortLayer, DummyLayer, RNN layers and cuDNN layers.
+ * Update Layer class to carry multiple data/grad Blobs.
+ * Extract features and test performance for new data by loading previously trained model parameters.
+ * Add Store class for IO operations.
+
+
+* v0.1.0 (8 October 2015):
+ * [Apache SINGA 0.1.0](https://archive.apache.org/dist/incubator/singa/apache-singa-incubating-0.1.0.tar.gz)
+ [\[MD5\]](https://archive.apache.org/dist/incubator/singa/apache-singa-incubating-0.1.0.tar.gz.md5)
+ [\[KEYS\]](https://archive.apache.org/dist/incubator/singa/KEYS)
+ * [Amazon EC2 image](https://console.aws.amazon.com/ec2/v2/home?region=ap-southeast-1#LaunchInstanceWizard:ami=ami-b41001e6)
+ * [Release Notes 0.1.0](releases/RELEASE_NOTES_0.1.0.html)
+ * Major features include,
+ * Installation using GNU build utility
+ * Scripts for job management with zookeeper
+ * Programming model based on NeuralNet and Layer abstractions.
+ * System architecture based on Worker, Server and Stub.
+ * Training models from three different model categories, namely, feed-forward models, energy models and RNN models.
+ * Synchronous and asynchronous distributed training frameworks using CPU
+ * Checkpoint and restore
+ * Unit test using gtest
+
+**Disclaimer**
+
+Apache SINGA is an effort undergoing incubation at The Apache Software
+Foundation (ASF), sponsored by the Apache Incubator PMC. Incubation is
+required of all newly accepted projects until a further review indicates that
+the infrastructure, communications, and decision making process have stabilized
+in a manner consistent with other successful ASF projects. While incubation
+status is not necessarily a reflection of the completeness or stability of the
+code, it does indicate that the project has yet to be fully endorsed by the
+ASF.
Added: incubator/singa/site/trunk/en/_sources/index.rst.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/index.rst.txt?rev=1784386&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/index.rst.txt (added)
+++ incubator/singa/site/trunk/en/_sources/index.rst.txt Sat Feb 25 15:30:33 2017
@@ -0,0 +1,134 @@
+.. Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+
+
+.. Singa documentation master file, created by
+ sphinx-quickstart on Sat Jul 9 20:36:57 2016.
+ You can adapt this file completely to your liking, but it should at least
+ contain the root `toctree` directive.
+Welcome to Apache Singa
+=======================
+
+Recent News
+-----------
+
+* **Version 1.1** is now available, 12 Feb, 2017. `Download SINGA v1.1.0 <downloads.html>`_
+
+* A tutorial on SINGA V1 will be given at `SGInnovate <https://www.eventbrite.sg/e/ai-eveningssginnovate-apache-singa-tickets-31505061487>`_, on 23 March, 2017
+
+* **Version 1** is now available, 9 Sep, 2016. `Download SINGA v1.0.0 <downloads.html>`_
+
+* SINGA will be presented at `REWORK <https://www.re-work.co/events/deep-learning-singapore/schedule>`_, 21 Oct, 2016.
+
+* SINGA was presented at `PyDataSG <http://www.meetup.com/PyData-SG/events/229691286/>`_, 16 Aug, 2016.
+
+* The **third release** is now available, 20 April, 2016. `Download SINGA v0.3.0 <downloads.html>`_
+
+* The **second release** is now available, 14 Jan, 2016. `Download SINGA v0.2.0 <downloads.html>`_.
+
+* SINGA will be presented at `Strata+Hadoop <http://strataconf.com/big-data-conference-sg-2015/public/schedule/detail/45123>`_ on 2 Dec, 2015
+
+* SINGA was presented at `ACM Multimedia <http://www.acmmm.org/2015/at-a-glance/>`_ Best Paper session and Open Source Software Competition session, 26-30 Oct, 2015 (`Slides <http://www.comp.nus.edu.sg/~dbsystem/singa//assets/file/mm2015.ppt>`_)
+
+* The **first release** is now available, 8 Oct, 2015. `Download SINGA v0.1.0 <downloads.html>`_.
+
+* SINGA was presented at `workshop on deep learning <http://www.comp.nus.edu.sg/~dbsystem/singa/workshop>`_ held on 16 Sep, 2015
+
+* SINGA was presented at `BOSS <http://boss.dima.tu-berlin.de/>`_ of `VLDB 2015 <http://www.vldb.org/2015/>`_ at Hawaii, 4 Sep, 2015. (slides: `overview <http://www.comp.nus.edu.sg/~dbsystem/singa/assets/file/singa-vldb-boss.pptx>`_, `basic <http://www.comp.nus.edu.sg/~dbsystem/singa/assets/file/basic-user-guide.pptx>`_, `advanced <http://www.comp.nus.edu.sg/~dbsystem/singa/assets/file/advanced-user-guide.pptx>`_)
+
+* SINGA was presented at `ADSC/I2R Deep Learning Workshop <http://adsc.illinois.edu/contact-us>`_, 25 Aug, 2015.
+
+* A tutorial on SINGA was given at VLDB summer school at Tsinghua University, 25-31 July, 2015.
+
+* A half day tutorial on SINGA was given at I2R, 29 June, 2015.
+
+* SINGA was presented at `DanaC <http://danac.org/>`_ of `SIGMOD 2015 <http://www.sigmod2015.org/index.shtml>`_ at Melbourne, 31 May - 4 June, 2015.
+
+* SINGA has been accepted by `Apache Incubator <http://incubator.apache.org/>`_, 17 March, 2015.
+
+Getting Started
+---------------
+* Try SINGA on `AWS <https://aws.amazon.com/marketplace/pp/B01NAUAWZW>`_ or via `Docker <https://hub.docker.com/r/nusdbsystem/singa/>`_.
+
+* Install SINGA via `python wheel files <./docs/installation.html#from-wheel>`_, `Debian packages <./docs/installation.html#from-debian-package>`_ or from `source <./docs/installation.html#from-source>`_.
+
+* Refer to the `Jupyter notebooks <http://nbviewer.jupyter.org/github/apache/incubator-singa/blob/master/doc/en/docs/notebook/index.ipynb>`_ for some basic examples and the `model zoo page <./docs/model_zoo/index.html>`_ for more examples.
+
+Documentation
+-------------
+
+* Documentation and Python APIs are listed `here <docs.html>`_.
+* `C++ APIs <http://www.comp.nus.edu.sg/~dbsystem/singa/api/>`_ are generated by Doxygen.
+* Research publication list is available `here <http://www.comp.nus.edu.sg/~dbsystem/singa/research/publication/>`_.
+
+How to contribute
+----------------------
+
+* Please subscribe to our development mailing list dev-subscribe@singa.incubator.apache.org.
+
+* If you find any issues using SINGA, please report it to the `Issue Tracker <https://issues.apache.org/jira/browse/singa>`_.
+
+* You can also contact `SINGA committers <community/team-list.html>`_ directly.
+
+More details on contributing to SINGA are described `here <develop/how-contribute.html>`_.
+
+Citing SINGA
+------------
+
+Please cite the following two papers if you use SINGA in your research:
+
+* B. C. Ooi, K.-L. Tan, S. Wang, W. Wang, Q. Cai, G. Chen, J. Gao, Z. Luo, A. K. H. Tung, Y. Wang, Z. Xie, M. Zhang, and K. Zheng. `SINGA: A distributed deep learning platform <http://www.comp.nus.edu.sg/~ooibc/singaopen-mm15.pdf>`_. ACM Multimedia (Open Source Software Competition) 2015 (`BibTex <http://www.comp.nus.edu.sg/~dbsystem/singa//assets/file/bib-oss.txt>`_).
+
+* W. Wang, G. Chen, T. T. A. Dinh, B. C. Ooi, K.-L.Tan, J. Gao, and S. Wang. `SINGA: putting deep learning in the hands of multimedia users <http://www.comp.nus.edu.sg/~ooibc/singa-mm15.pdf>`_. ACM Multimedia 2015 (`BibTex <http://www.comp.nus.edu.sg/~dbsystem/singa//assets/file/bib-singa.txt>`_, `Slides <files/mm2015.ppt>`_).
+
+.. toctree::
+ :hidden:
+
+ downloads
+ docs/index
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+ :caption: Development
+
+ develop/schedule
+ develop/how-contribute
+ develop/contribute-code
+ develop/contribute-docs
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+ :caption: Community
+
+ community/source-repository
+ community/mail-lists
+ community/issue-tracking
+ community/team-list
+
+
+
+License
+----------
+SINGA is released under `Apache License Version 2.0 <http://www.apache.org/licenses/LICENSE-2.0>`_.
+
+Disclaimers
+-----------
+
+Apache SINGA is an effort undergoing incubation at The Apache Software Foundation (ASF), sponsored by the Apache Incubator. Incubation is required of all newly accepted projects until a further review indicates that the infrastructure, communications, and decision making process have stabilized in a manner consistent with other successful ASF projects. While incubation status is not necessarily a reflection of the completeness or stability of the code, it does indicate that the project has yet to be fully endorsed by the ASF.
+
Added: incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_0.1.0.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_0.1.0.md.txt?rev=1784386&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_0.1.0.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_0.1.0.md.txt Sat Feb 25 15:30:33 2017
@@ -0,0 +1,99 @@
+# singa-incubating-0.1.0 Release Notes
+
+---
+
+SINGA is a general distributed deep learning platform for training big deep learning models over large datasets. It is
+designed with an intuitive programming model based on the layer abstraction. SINGA supports a wide variety of popular
+deep learning models.
+
+This release includes the following features:
+
+ * Job management
+ * [SINGA-3](https://issues.apache.org/jira/browse/SINGA-3) Use Zookeeper to check stopping (finish) time of the system
+ * [SINGA-16](https://issues.apache.org/jira/browse/SINGA-16) Runtime Process id Management
+ * [SINGA-25](https://issues.apache.org/jira/browse/SINGA-25) Setup glog output path
+ * [SINGA-26](https://issues.apache.org/jira/browse/SINGA-26) Run distributed training in a single command
+ * [SINGA-30](https://issues.apache.org/jira/browse/SINGA-30) Enhance easy-to-use feature and support concurrent jobs
+ * [SINGA-33](https://issues.apache.org/jira/browse/SINGA-33) Automatically launch a number of processes in the cluster
+ * [SINGA-34](https://issues.apache.org/jira/browse/SINGA-34) Support external zookeeper service
+ * [SINGA-38](https://issues.apache.org/jira/browse/SINGA-38) Support concurrent jobs
+ * [SINGA-39](https://issues.apache.org/jira/browse/SINGA-39) Avoid ssh in scripts for single node environment
+ * [SINGA-43](https://issues.apache.org/jira/browse/SINGA-43) Remove Job-related output from workspace
+ * [SINGA-56](https://issues.apache.org/jira/browse/SINGA-56) No automatic launching of zookeeper service
+ * [SINGA-73](https://issues.apache.org/jira/browse/SINGA-73) Refine the selection of available hosts from host list
+
+
+ * Installation with GNU Auto tool
+ * [SINGA-4](https://issues.apache.org/jira/browse/SINGA-4) Refine thirdparty-dependency installation
+ * [SINGA-13](https://issues.apache.org/jira/browse/SINGA-13) Separate intermediate files of compilation from source files
+ * [SINGA-17](https://issues.apache.org/jira/browse/SINGA-17) Add root permission within thirdparty/install.
+ * [SINGA-27](https://issues.apache.org/jira/browse/SINGA-27) Generate python modules for proto objects
+ * [SINGA-53](https://issues.apache.org/jira/browse/SINGA-53) Add lmdb compiling options
+ * [SINGA-62](https://issues.apache.org/jira/browse/SINGA-62) Remove building scrips and auxiliary files
+ * [SINGA-67](https://issues.apache.org/jira/browse/SINGA-67) Add singatest into build targets
+
+
+ * Distributed training
+ * [SINGA-7](https://issues.apache.org/jira/browse/SINGA-7) Implement shared memory Hogwild algorithm
+ * [SINGA-8](https://issues.apache.org/jira/browse/SINGA-8) Implement distributed Hogwild
+ * [SINGA-19](https://issues.apache.org/jira/browse/SINGA-19) Slice large Param objects for load-balance
+ * [SINGA-29](https://issues.apache.org/jira/browse/SINGA-29) Update NeuralNet class to enable layer partition type customization
+ * [SINGA-24](https://issues.apache.org/jira/browse/SINGA-24) Implement Downpour training framework
+ * [SINGA-32](https://issues.apache.org/jira/browse/SINGA-32) Implement AllReduce training framework
+ * [SINGA-57](https://issues.apache.org/jira/browse/SINGA-57) Improve Distributed Hogwild
+
+
+ * Training algorithms for different model categories
+ * [SINGA-9](https://issues.apache.org/jira/browse/SINGA-9) Add Support for Restricted Boltzmann Machine (RBM) model
+ * [SINGA-10](https://issues.apache.org/jira/browse/SINGA-10) Add Support for Recurrent Neural Networks (RNN)
+
+
+ * Checkpoint and restore
+ * [SINGA-12](https://issues.apache.org/jira/browse/SINGA-12) Support Checkpoint and Restore
+
+
+ * Unit test
+ * [SINGA-64](https://issues.apache.org/jira/browse/SINGA-64) Add the test module for utils/common
+
+
+ * Programming model
+ * [SINGA-36](https://issues.apache.org/jira/browse/SINGA-36) Refactor job configuration, driver program and scripts
+ * [SINGA-37](https://issues.apache.org/jira/browse/SINGA-37) Enable users to set parameter sharing in model configuration
+ * [SINGA-54](https://issues.apache.org/jira/browse/SINGA-54) Refactor job configuration to move fields in ModelProto out
+ * [SINGA-55](https://issues.apache.org/jira/browse/SINGA-55) Refactor main.cc and singa.h
+ * [SINGA-61](https://issues.apache.org/jira/browse/SINGA-61) Support user defined classes
+ * [SINGA-65](https://issues.apache.org/jira/browse/SINGA-65) Add an example of writing user-defined layers
+
+
+ * Other features
+ * [SINGA-6](https://issues.apache.org/jira/browse/SINGA-6) Implement thread-safe singleton
+ * [SINGA-18](https://issues.apache.org/jira/browse/SINGA-18) Update API for displaying performance metric
+ * [SINGA-77](https://issues.apache.org/jira/browse/SINGA-77) Integrate with Apache RAT
+
+
+Some bugs were fixed during the development of this release:
+
+ * [SINGA-2](https://issues.apache.org/jira/browse/SINGA-2) Check failed: zsock_connect
+ * [SINGA-5](https://issues.apache.org/jira/browse/SINGA-5) Server early terminate when zookeeper singa folder is not initially empty
+ * [SINGA-15](https://issues.apache.org/jira/browse/SINGA-15) Fix a bug in the ConnectStub function which gets stuck when connecting layer_dealer_
+ * [SINGA-22](https://issues.apache.org/jira/browse/SINGA-22) Cannot find openblas library when it is installed in default path
+ * [SINGA-23](https://issues.apache.org/jira/browse/SINGA-23) Libtool version mismatch error.
+ * [SINGA-28](https://issues.apache.org/jira/browse/SINGA-28) Fix a bug from topology sort of Graph
+ * [SINGA-42](https://issues.apache.org/jira/browse/SINGA-42) Issue when loading checkpoints
+ * [SINGA-44](https://issues.apache.org/jira/browse/SINGA-44) A bug when resetting metric values
+ * [SINGA-46](https://issues.apache.org/jira/browse/SINGA-46) Fix a bug in updater.cc to scale the gradients
+ * [SINGA-47](https://issues.apache.org/jira/browse/SINGA-47) Fix a bug in data layers that leads to out-of-memory when group size is too large
+ * [SINGA-48](https://issues.apache.org/jira/browse/SINGA-48) Fix a bug in trainer.cc that assigns the same NeuralNet instance to workers from diff groups
+ * [SINGA-49](https://issues.apache.org/jira/browse/SINGA-49) Fix a bug in HandlePutMsg func that sets param fields to invalid values
+ * [SINGA-66](https://issues.apache.org/jira/browse/SINGA-66) Fix bugs in Worker::RunOneBatch function and ClusterProto
+ * [SINGA-79](https://issues.apache.org/jira/browse/SINGA-79) Fix bug in singatool that can not parse -conf flag
+
+
+Features planned for the next release
+
+ * [SINGA-11](https://issues.apache.org/jira/browse/SINGA-11) Start SINGA using Mesos
+ * [SINGA-31](https://issues.apache.org/jira/browse/SINGA-31) Extend Blob to support xpu (cpu or gpu)
+ * [SINGA-35](https://issues.apache.org/jira/browse/SINGA-35) Add random number generators
+ * [SINGA-40](https://issues.apache.org/jira/browse/SINGA-40) Support sparse Param update
+ * [SINGA-41](https://issues.apache.org/jira/browse/SINGA-41) Support single node single GPU training
+
Added: incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_0.2.0.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_0.2.0.md.txt?rev=1784386&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_0.2.0.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_0.2.0.md.txt Sat Feb 25 15:30:33 2017
@@ -0,0 +1,84 @@
+# singa-incubating-0.2.0 Release Notes
+
+---
+
+SINGA is a general distributed deep learning platform for training big deep
+learning models over large datasets. It is designed with an intuitive
+programming model based on the layer abstraction. SINGA supports a wide variety
+of popular deep learning models.
+
+This release includes the following **major features**:
+
+* [Training on GPU](../docs/gpu.html) enables training of complex models on a single node with multiple GPU cards.
+* [Hybrid neural net partitioning](../docs/hybrid.html) supports data and model parallelism at the same time.
+* [Python wrapper](../docs/python.html) makes it easy to configure the job, including neural net and SGD algorithm.
+* [RNN model and BPTT algorithm](../docs/general-rnn.html) are implemented to support applications based on RNN models, e.g., GRU.
+* [Cloud software integration](../docs/distributed-training.html) includes Mesos, Docker and HDFS.
+
+
+**More details** are listed as follows,
+
+ * Programming model
+ * [SINGA-80] New Blob Level and Address Level Math Operation Interface
+ * [SINGA-82] Refactor input layers using data store abstraction
+ * [SINGA-87] Replace exclude field to include field for layer configuration
+ * [SINGA-110] Add Layer member datavec_ and gradvec_
+ * [SINGA-120] Implemented GRU and BPTT (BPTTWorker)
+
+
+ * Neuralnet layers
+ * [SINGA-91] Add SoftmaxLayer and ArgSortLayer
+ * [SINGA-106] Add dummy layer for test purpose
+ * [SINGA-120] Implemented GRU and BPTT (GRULayer and OneHotLayer)
+
+
+ * GPU training support
+ * [SINGA-100] Implement layers using CUDNN for GPU training
+ * [SINGA-104] Add Context Class
+ * [SINGA-105] Update GNU make files for compiling cuda related code
+ * [SINGA-98] Add Support for AlexNet ImageNet Classification Model
+
+
+ * Model/Hybrid partition
+ * [SINGA-109] Refine bridge layers
+ * [SINGA-111] Add slice, concat and split layers
+ * [SINGA-113] Model/Hybrid Partition Support
+
+
+ * Python binding
+ * [SINGA-108] Add Python wrapper to singa
+
+
+ * Predict-only mode
+ * [SINGA-85] Add functions for extracting features and test new data
+
+
+ * Integrate with third-party tools
+ * [SINGA-11] Start SINGA on Apache Mesos
+ * [SINGA-78] Use Doxygen to generate documentation
+ * [SINGA-89] Add Docker support
+
+
+ * Unit test
+ * [SINGA-95] Add make test after building
+
+
+ * Other improvements
+ * [SINGA-84] Header Files Rearrange
+ * [SINGA-93] Remove the asterisk in the log tcp://169.254.12.152:*:49152
+ * [SINGA-94] Move call to google::InitGoogleLogging() from Driver::Init() to main()
+ * [SINGA-96] Add Momentum to Cifar10 Example
+ * [SINGA-101] Add ll (ls -l) command in .bashrc file when using docker
+ * [SINGA-114] Remove short logs in tmp directory
+ * [SINGA-115] Print layer debug information in the neural net graph file
+ * [SINGA-118] Make protobuf LayerType field id easy to assign
+ * [SINGA-97] Add HDFS Store
+
+
+ * Bugs fixed
+ * [SINGA-85] Fix compilation errors in examples
+ * [SINGA-90] Miscellaneous trivial bug fixes
+ * [SINGA-107] Error from loading pre-trained params for training stacked RBMs
+ * [SINGA-116] Fix a bug in InnerProductLayer caused by weight matrix sharing
+
+
Added: incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_0.3.0.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_0.3.0.md.txt?rev=1784386&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_0.3.0.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_0.3.0.md.txt Sat Feb 25 15:30:33 2017
@@ -0,0 +1,37 @@
+# singa-incubating-0.3.0 Release Notes
+
+---
+
+SINGA is a general distributed deep learning platform for training big deep
+learning models over large datasets. It is designed with an intuitive
+programming model based on the layer abstraction. SINGA supports a wide variety
+of popular deep learning models.
+
+This release includes the following features:
+
+ * GPU Support
+ * [SINGA-131] Implement and optimize hybrid training using both CPU and GPU
+ * [SINGA-136] Support cuDNN v4
+ * [SINGA-134] Extend SINGA to run over a GPU cluster
+ * [SINGA-157] Change the priority of cudnn library and install libsingagpu.so
+
+ * Remove Dependences
+ * [SINGA-156] Remove the dependency on ZMQ for single process training
+ * [SINGA-155] Remove zookeeper for single-process training
+
+ * Python Binding
+ * [SINGA-126] Python Binding for Interactive Training
+
+ * Other Improvements
+ * [SINGA-80] New Blob Level and Address Level Math Operation Interface
+ * [SINGA-130] Data Prefetching
+ * [SINGA-145] New SGD based optimization Updaters: AdaDelta, Adam, AdamMax
+
+ * Bugs Fixed
+ * [SINGA-148] Race condition between Worker threads and Driver
+ * [SINGA-150] Mesos Docker container failed
+ * [SINGA-141] Undesired Hash collision when locating process id to worker…
+ * [SINGA-149] Docker build fail
+ * [SINGA-143] The compilation cannot detect libsingagpu.so file
+
+
Added: incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_1.0.0.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_1.0.0.md.txt?rev=1784386&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_1.0.0.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_1.0.0.md.txt Sat Feb 25 15:30:33 2017
@@ -0,0 +1,91 @@
+# singa-incubating-1.0.0 Release Notes
+
+---
+
+SINGA is a general distributed deep learning platform for training big deep
+learning models over large datasets. It is designed with an intuitive
+programming model based on the layer abstraction. SINGA supports a wide variety
+of popular deep learning models.
+
+This release includes the following features:
+
+ * Core abstractions including Tensor and Device
+ * [SINGA-207] Update Tensor functions for matrices
+ * [SINGA-205] Enable slice and concatenate operations for Tensor objects
+ * [SINGA-197] Add CNMem as a submodule in lib/
+ * [SINGA-196] Rename class Blob to Block
+ * [SINGA-194] Add a Platform singleton
+ * [SINGA-175] Add memory management APIs and implement a subclass using CNMeM
+ * [SINGA-173] OpenCL Implementation
+ * [SINGA-171] Create CppDevice and CudaDevice
+ * [SINGA-168] Implement Cpp Math functions APIs
+ * [SINGA-162] Overview of features for V1.x
+ * [SINGA-165] Add cross-platform timer API to singa
+ * [SINGA-167] Add Tensor Math function APIs
+ * [SINGA-166] light built-in logging for making glog optional
+ * [SINGA-164] Add the base Tensor class
+
+
+ * IO components for file read/write, network and data pre-processing
+ * [SINGA-233] New communication interface
+ * [SINGA-215] Implement Image Transformation for Image Pre-processing
+ * [SINGA-214] Add LMDBReader and LMDBWriter for LMDB
+ * [SINGA-213] Implement Encoder and Decoder for CSV
+ * [SINGA-211] Add TextFileReader and TextFileWriter for CSV files
+ * [SINGA-210] Enable checkpoint and resume for v1.0
+ * [SINGA-208] Add DataIter base class and a simple implementation
+ * [SINGA-203] Add OpenCV detection for cmake compilation
+ * [SINGA-202] Add reader and writer for binary file
+ * [SINGA-200] Implement Encoder and Decoder for data pre-processing
+
+
+
+ * Module components including layer classes, training algorithms and Python binding
+ * [SINGA-235] Unify the engines for cudnn and singa layers
+ * [SINGA-230] OpenCL Convolution layer and Pooling layer
+ * [SINGA-222] Fixed bugs in IO
+ * [SINGA-218] Implementation for RNN CUDNN version
+ * [SINGA-204] Support the training of feed-forward neural nets
+ * [SINGA-199] Implement Python classes for SGD optimizers
+ * [SINGA-198] Change Layer::Setup API to include input Tensor shapes
+ * [SINGA-193] Add Python layers
+    * [SINGA-192] Implement optimization algorithms for Singa v1 (Nesterov, AdaGrad, RMSProp)
+ * [SINGA-191] Add "autotune" for CudnnConvolution Layer
+ * [SINGA-190] Add prelu layer and flatten layer
+ * [SINGA-189] Generate python outputs of proto files
+ * [SINGA-188] Add Dense layer
+ * [SINGA-187] Add popular parameter initialization methods
+ * [SINGA-186] Create Python Tensor class
+ * [SINGA-184] Add Cross Entropy loss computation
+ * [SINGA-183] Add the base classes for optimizer, constraint and regularizer
+ * [SINGA-180] Add Activation layer and Softmax layer
+ * [SINGA-178] Add Convolution layer and Pooling layer
+ * [SINGA-176] Add loss and metric base classes
+    * [SINGA-174] Add Batch Normalization layer and Local Response Normalization layer.
+ * [SINGA-170] Add Dropout layer and CudnnDropout layer.
+ * [SINGA-169] Add base Layer class for V1.0
+
+
+ * Examples
+ * [SINGA-232] Alexnet on Imagenet
+    * [SINGA-231] Batch-normalized VGG model for cifar-10
+ * [SINGA-228] Add Cpp Version of Convolution and Pooling layer
+ * [SINGA-227] Add Split and Merge Layer and add ResNet Implementation
+
+ * Documentation
+ * [SINGA-239] Transfer documentation files of v0.3.0 to github
+ * [SINGA-238] RBM on mnist
+ * [SINGA-225] Documentation for installation and Cifar10 example
+ * [SINGA-223] Use Sphinx to create the website
+
+ * Tools for compilation and some utility code
+ * [SINGA-229] Complete install targets
+ * [SINGA-221] Support for Travis-CI
+ * [SINGA-217] build python package with setup.py
+ * [SINGA-216] add jenkins for CI support
+ * [SINGA-212] Disable the compilation of libcnmem if USE_CUDA is OFF
+ * [SINGA-195] Channel for sending training statistics
+ * [SINGA-185] Add CBLAS and GLOG detection for singav1
+ * [SINGA-181] Add NVCC supporting for .cu files
+ * [SINGA-177] Add fully cmake supporting for the compilation of singa_v1
+ * [SINGA-172] Add CMake supporting for Cuda and Cudnn libs
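The "intuitive programming model based on the layer abstraction" described at the top of these notes, together with the feed-forward training support of [SINGA-204], can be sketched in plain Python. This is a conceptual illustration only; the class names and method signatures below are hypothetical and are not PySINGA's actual API:

```python
# Minimal sketch of a layer abstraction: each layer exposes forward(),
# and a feed-forward net is just a composition of layers. Illustrative
# only -- PySINGA's real classes and signatures differ.

class Layer:
    def forward(self, x):
        raise NotImplementedError

class Scale(Layer):
    """Multiply every input element by a constant factor."""
    def __init__(self, factor):
        self.factor = factor
    def forward(self, x):
        return [v * self.factor for v in x]

class ReLU(Layer):
    """Clamp negative values to zero."""
    def forward(self, x):
        return [max(0.0, v) for v in x]

class FeedForwardNet:
    """Apply the layers in sequence, as a feed-forward net does."""
    def __init__(self, layers):
        self.layers = layers
    def forward(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x

net = FeedForwardNet([Scale(2.0), ReLU()])
print(net.forward([-1.0, 3.0]))  # [0.0, 6.0]
```

The point of the abstraction is that adding a new layer type only requires implementing `forward()` (and, in a real framework, its backward pass); the network composition logic is unchanged.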
Added: incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_1.1.0.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_1.1.0.md.txt?rev=1784386&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_1.1.0.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/releases/RELEASE_NOTES_1.1.0.md.txt Sat Feb 25 15:30:33 2017
@@ -0,0 +1,49 @@
+# singa-incubating-1.1.0 Release Notes
+
+---
+
+SINGA is a general distributed deep learning platform for training big deep
+learning models over large datasets.
+
+This release includes the following features:
+
+ * Core components
+ * [SINGA-296] Add sign and to_host function for pysinga tensor module
+
+ * Model components
+ * [SINGA-254] Implement Adam for V1
+ * [SINGA-264] Extend the FeedForwardNet to accept multiple inputs
+ * [SINGA-267] Add spatial mode in batch normalization layer
+ * [SINGA-271] Add Concat and Slice layers
+ * [SINGA-275] Add Cross Entropy Loss for multiple labels
+ * [SINGA-278] Convert trained caffe parameters to singa
+ * [SINGA-287] Add memory size check for cudnn convolution
+
+ * Utility functions and CI
+ * [SINGA-242] Compile all source files into a single library.
+ * [SINGA-244] Separating swig interface and python binding files
+ * [SINGA-246] Imgtool for image augmentation
+ * [SINGA-247] Add windows support for singa
+ * [SINGA-251] Implement image loader for pysinga
+ * [SINGA-252] Use the snapshot methods to dump and load models for pysinga
+    * [SINGA-255] Compile mandatory dependent libraries together with SINGA code
+ * [SINGA-259] Add maven pom file for building java classes
+ * [SINGA-261] Add version ID into the checkpoint files
+ * [SINGA-266] Add Rafiki python toolkits
+ * [SINGA-273] Improve license and contributions
+ * [SINGA-284] Add python unittest into Jenkins and link static libs into whl file
+ * [SINGA-280] Jenkins CI support
+ * [SINGA-288] Publish wheel of PySINGA generated by Jenkins to public servers
+
+ * Documentation and usability
+ * [SINGA-263] Create Amazon Machine Image
+ * [SINGA-268] Add IPython notebooks to the documentation
+ * [SINGA-276] Create docker images
+ * [SINGA-289] Update SINGA website automatically using Jenkins
+ * [SINGA-295] Add an example of image classification using GoogleNet
+
+ * Bugs fixed
+ * [SINGA-245] float as the first operand can not multiply with a tensor object
+    * [SINGA-293] Bug from compiling PySINGA on Mac OS X with multiple versions of Python
+
+
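The [SINGA-245] fix above concerns Python's reflected operators: `2.0 * t` first tries `float.__mul__(2.0, t)`, which returns `NotImplemented`, and Python then falls back to `type(t).__rmul__(t, 2.0)` -- so a tensor class must define `__rmul__` for a float left operand to work. A minimal self-contained sketch of the mechanism (not PySINGA's actual Tensor class):

```python
# Sketch of why "float * tensor" needs __rmul__. Without the __rmul__
# assignment below, 2.0 * t raises TypeError even though t * 2.0 works.
# MiniTensor is illustrative only, not PySINGA's real API.

class MiniTensor:
    def __init__(self, data):
        self.data = list(data)

    def __mul__(self, scalar):   # handles: tensor * float
        return MiniTensor(v * scalar for v in self.data)

    __rmul__ = __mul__           # handles: float * tensor (the SINGA-245 case)

t = MiniTensor([1.0, 2.0])
print((2.0 * t).data)  # [2.0, 4.0]
```

Reusing `__mul__` as `__rmul__` is safe here because scalar multiplication is commutative; operators like subtraction need a distinct reflected implementation.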
Modified: incubator/singa/site/trunk/en/_static/basic.css
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_static/basic.css?rev=1784386&r1=1784385&r2=1784386&view=diff
==============================================================================
--- incubator/singa/site/trunk/en/_static/basic.css (original)
+++ incubator/singa/site/trunk/en/_static/basic.css Sat Feb 25 15:30:33 2017
@@ -52,6 +52,8 @@ div.sphinxsidebar {
width: 230px;
margin-left: -100%;
font-size: 90%;
+ word-wrap: break-word;
+ overflow-wrap : break-word;
}
div.sphinxsidebar ul {
@@ -83,10 +85,6 @@ div.sphinxsidebar #searchbox input[type=
width: 170px;
}
-div.sphinxsidebar #searchbox input[type="submit"] {
- width: 30px;
-}
-
img {
border: 0;
max-width: 100%;
@@ -124,6 +122,8 @@ ul.keywordmatches li.goodmatch a {
table.contentstable {
width: 90%;
+ margin-left: auto;
+ margin-right: auto;
}
table.contentstable p.biglink {
@@ -151,9 +151,14 @@ table.indextable td {
vertical-align: top;
}
-table.indextable dl, table.indextable dd {
+table.indextable ul {
margin-top: 0;
margin-bottom: 0;
+ list-style-type: none;
+}
+
+table.indextable > tbody > tr > td > ul {
+ padding-left: 0em;
}
table.indextable tr.pcap {
@@ -185,8 +190,22 @@ div.genindex-jumpbox {
padding: 0.4em;
}
+/* -- domain module index --------------------------------------------------- */
+
+table.modindextable td {
+ padding: 2px;
+ border-collapse: collapse;
+}
+
/* -- general body styles --------------------------------------------------- */
+div.body p, div.body dd, div.body li, div.body blockquote {
+ -moz-hyphens: auto;
+ -ms-hyphens: auto;
+ -webkit-hyphens: auto;
+ hyphens: auto;
+}
+
a.headerlink {
visibility: hidden;
}
@@ -212,10 +231,6 @@ div.body td {
text-align: left;
}
-.field-list ul {
- padding-left: 1em;
-}
-
.first {
margin-top: 0 !important;
}
@@ -332,10 +347,6 @@ table.docutils td, table.docutils th {
border-bottom: 1px solid #aaa;
}
-table.field-list td, table.field-list th {
- border: 0 !important;
-}
-
table.footnote td, table.footnote th {
border: 0 !important;
}
@@ -372,6 +383,20 @@ div.figure p.caption span.caption-number
div.figure p.caption span.caption-text {
}
+/* -- field list styles ----------------------------------------------------- */
+
+table.field-list td, table.field-list th {
+ border: 0 !important;
+}
+
+.field-list ul {
+ margin: 0;
+ padding-left: 1em;
+}
+
+.field-list p {
+ margin: 0;
+}
/* -- other body styles ----------------------------------------------------- */
@@ -422,15 +447,6 @@ dl.glossary dt {
font-size: 1.1em;
}
-.field-list ul {
- margin: 0;
- padding-left: 1em;
-}
-
-.field-list p {
- margin: 0;
-}
-
.optional {
font-size: 1.3em;
}
@@ -489,6 +505,13 @@ pre {
overflow-y: hidden; /* fixes display issues on Chrome browsers */
}
+span.pre {
+ -moz-hyphens: none;
+ -ms-hyphens: none;
+ -webkit-hyphens: none;
+ hyphens: none;
+}
+
td.linenos pre {
padding: 5px 0px;
border: 0;
@@ -580,6 +603,16 @@ span.eqno {
float: right;
}
+span.eqno a.headerlink {
+ position: relative;
+ left: 0px;
+ z-index: 1;
+}
+
+div.math:hover a.headerlink {
+ visibility: visible;
+}
+
/* -- printout stylesheet --------------------------------------------------- */
@media print {
Modified: incubator/singa/site/trunk/en/_static/comment-bright.png
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_static/comment-bright.png?rev=1784386&r1=1784385&r2=1784386&view=diff
==============================================================================
Binary files - no diff available.
Modified: incubator/singa/site/trunk/en/_static/comment-close.png
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_static/comment-close.png?rev=1784386&r1=1784385&r2=1784386&view=diff
==============================================================================
Binary files - no diff available.
Modified: incubator/singa/site/trunk/en/_static/comment.png
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_static/comment.png?rev=1784386&r1=1784385&r2=1784386&view=diff
==============================================================================
Binary files - no diff available.
Modified: incubator/singa/site/trunk/en/_static/doctools.js
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_static/doctools.js?rev=1784386&r1=1784385&r2=1784386&view=diff
==============================================================================
--- incubator/singa/site/trunk/en/_static/doctools.js (original)
+++ incubator/singa/site/trunk/en/_static/doctools.js Sat Feb 25 15:30:33 2017
@@ -124,6 +124,7 @@ var Documentation = {
this.fixFirefoxAnchorBug();
this.highlightSearchWords();
this.initIndexTable();
+
},
/**
@@ -252,6 +253,29 @@ var Documentation = {
});
var url = parts.join('/');
return path.substring(url.lastIndexOf('/') + 1, path.length - 1);
+ },
+
+ initOnKeyListeners: function() {
+ $(document).keyup(function(event) {
+ var activeElementType = document.activeElement.tagName;
+ // don't navigate when in search box or textarea
+ if (activeElementType !== 'TEXTAREA' && activeElementType !== 'INPUT' && activeElementType !== 'SELECT') {
+ switch (event.keyCode) {
+ case 37: // left
+ var prevHref = $('link[rel="prev"]').prop('href');
+ if (prevHref) {
+ window.location.href = prevHref;
+ return false;
+ }
+ case 39: // right
+ var nextHref = $('link[rel="next"]').prop('href');
+ if (nextHref) {
+ window.location.href = nextHref;
+ return false;
+ }
+ }
+ }
+ });
}
};
@@ -260,4 +284,4 @@ _ = Documentation.gettext;
$(document).ready(function() {
Documentation.init();
-});
+});
\ No newline at end of file
Modified: incubator/singa/site/trunk/en/_static/down-pressed.png
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_static/down-pressed.png?rev=1784386&r1=1784385&r2=1784386&view=diff
==============================================================================
Binary files - no diff available.
Modified: incubator/singa/site/trunk/en/_static/down.png
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_static/down.png?rev=1784386&r1=1784385&r2=1784386&view=diff
==============================================================================
Binary files - no diff available.
Modified: incubator/singa/site/trunk/en/_static/file.png
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_static/file.png?rev=1784386&r1=1784385&r2=1784386&view=diff
==============================================================================
Binary files - no diff available.
Modified: incubator/singa/site/trunk/en/_static/fonts/Lato-Bold.ttf
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_static/fonts/Lato-Bold.ttf?rev=1784386&r1=1784385&r2=1784386&view=diff
==============================================================================
Binary files - no diff available.
Modified: incubator/singa/site/trunk/en/_static/fonts/Lato-Regular.ttf
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_static/fonts/Lato-Regular.ttf?rev=1784386&r1=1784385&r2=1784386&view=diff
==============================================================================
Binary files - no diff available.
Modified: incubator/singa/site/trunk/en/_static/fonts/fontawesome-webfont.eot
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_static/fonts/fontawesome-webfont.eot?rev=1784386&r1=1784385&r2=1784386&view=diff
==============================================================================
Binary files - no diff available.