Posted to commits@singa.apache.org by ji...@apache.org on 2015/07/15 15:05:57 UTC
svn commit: r1691203 -
/incubator/singa/site/trunk/content/markdown/docs/examples.md
Author: jinyang
Date: Wed Jul 15 13:05:56 2015
New Revision: 1691203
URL: http://svn.apache.org/r1691203
Log:
the examples
Added:
incubator/singa/site/trunk/content/markdown/docs/examples.md
Added: incubator/singa/site/trunk/content/markdown/docs/examples.md
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/content/markdown/docs/examples.md?rev=1691203&view=auto
==============================================================================
--- incubator/singa/site/trunk/content/markdown/docs/examples.md (added)
+++ incubator/singa/site/trunk/content/markdown/docs/examples.md Wed Jul 15 13:05:56 2015
@@ -0,0 +1,170 @@
+Title:
+Notice: Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+ .
+ http://www.apache.org/licenses/LICENSE-2.0
+ .
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+
+Here are the examples of SINGA, covering MLP, CNN, RBM and RNN models. This tutorial shows the basics of configuring SINGA.
+To run a SINGA job, you need to configure two files: model.conf, which specifies the deep learning model, and cluster.conf, which defines the distributed training architecture.
+
+model.conf
+====
+model.conf configures the deep learning model you want to train.
+It should contain the neural net structure, the training algorithm (back-propagation, contrastive divergence, etc.),
+the SGD update algorithm (e.g. AdaGrad), the number of training/test steps, the training/test frequency,
+and display settings.
+SINGA reads model.conf as a Google protobuf message
+[ModelProto](https://github.com/apache/incubator-singa/blob/master/src/proto/model.proto).
+Here is a simple example, simplified from our [MLP example](https://github.com/apache/incubator-singa/blob/master/examples/mnist/model.conf):
+
+ name: "simple-mlp"
+ train_steps: 1000
+ test_steps: 10
+ test_frequency: 60
+ display_frequency: 30
+ alg: kBackPropagation
+ updater{
+ base_lr: 0.001
+ lr_change: kStep
+ type: kSGD
+ step_conf{
+ change_freq: 60
+ gamma: 0.997
+ }
+ }
+
+ neuralnet {
+ layer {
+ name: "data"
+ type: kShardData
+ sharddata_conf {
+ path: "examples/mnist/mnist_train_shard"
+ batchsize: 1000
+ }
+ exclude: kTest
+ }
+
+ layer {
+ name: "data"
+ type: kShardData
+ sharddata_conf {
+ path: "examples/mnist/mnist_test_shard"
+ batchsize: 1000
+ }
+ exclude: kTrain
+ }
+
+ layer{
+ name:"mnist"
+ type: kMnist
+ srclayers: "data"
+ mnist_conf {
+ norm_a: 127.5
+ norm_b: 1
+ }
+ }
+
+ layer{
+ name: "label"
+ type: kLabel
+ srclayers: "data"
+ }
+
+ layer{
+ name: "fc"
+ type: kInnerProduct
+ srclayers:"mnist"
+ innerproduct_conf{
+ num_output: 2500
+ }
+ param{
+ name: "weight"
+ init_method: kUniform
+ low:-0.05
+ high:0.05
+ }
+ param{
+ name: "bias"
+ init_method: kUniform
+ low: -0.05
+ high:0.05
+ }
+ }
+
+ layer{
+ name: "tanh"
+ type: kTanh
+ srclayers:"fc"
+ }
+
+ layer{
+ name: "pre-softmax"
+ type: kInnerProduct
+ srclayers:"tanh"
+ innerproduct_conf{
+ num_output: 2000
+ }
+ param{
+ name: "weight"
+ init_method: kUniform
+ low:-0.05
+ high:0.05
+ }
+ param{
+ name: "bias"
+ init_method: kUniform
+ low: -0.05
+ high:0.05
+ }
+ }
+
+ layer{
+ name: "loss"
+ type:kSoftmaxLoss
+ softmaxloss_conf{
+ topk:1
+ }
+ srclayers:"pre-softmax"
+ srclayers:"label"
+ }
+ }
+
+In this example, we define a neural net with one hidden layer. fc+tanh forms the hidden layer (fc is the inner-product part, and tanh is the non-linear activation function), and the final softmax layer is represented as pre-softmax+loss (inner product followed by softmax loss). For each layer, we define its name, its input layer(s), and its basic configuration (e.g. number of nodes, parameter initialization settings).
+You can find more details about the [programming model](http://singa.incubator.apache.org/docs/programming-model.html) on our website.
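A common mistake when editing model.conf by hand is a srclayers entry that does not match any defined layer name (e.g. "fc1" where the layer is named "fc"). Below is a minimal Python sketch of such a consistency check; the helper `check_srclayers` is hypothetical and not part of SINGA, and it uses a simple regex rather than a real protobuf parser:

```python
import re

# Hypothetical helper (not part of SINGA): collect every name: "..." and
# every srclayers: "..." in a model.conf text, and report srclayers
# references that do not resolve to any defined name.
def check_srclayers(conf_text):
    names = set(re.findall(r'name\s*:\s*"([^"]+)"', conf_text))
    refs = set(re.findall(r'srclayers\s*:\s*"([^"]+)"', conf_text))
    return refs - names  # empty set means all references resolve

conf = '''
layer { name: "data" type: kShardData }
layer { name: "fc" type: kInnerProduct srclayers: "mnist" }
'''
print(check_srclayers(conf))  # {'mnist'}: referenced but never defined
```

Note that the regex also picks up param names, so the check is lenient; for a strict check you would parse the file with the generated ModelProto class instead.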
+
+cluster.conf
+====
+cluster.conf configures the distributed architecture you want to use.
+SINGA reads cluster.conf as a Google protobuf message [ClusterProto](https://github.com/apache/incubator-singa/blob/master/src/proto/cluster.proto).
+By configuring cluster.conf, you can run SINGA on a single machine or in Sandblaster, Downpour, Hogwild, AllReduce and other modes.
+The details of the architecture settings are described in [System Architecture](http://singa.incubator.apache.org/docs/architecture.html) on our website. Below is a basic single-machine configuration:
+
+
+ nworker_groups: 1
+ nserver_groups: 1
+ nservers_per_group: 1
+ nworkers_per_group: 1
+ nservers_per_procs: 1
+ nworkers_per_procs: 1
+ workspace: "examples/mnist/"
+
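The group counts multiply out to the total number of workers and servers the job launches: total workers = nworker_groups × nworkers_per_group, and likewise for servers. A minimal sketch of this arithmetic (the function `cluster_size` is hypothetical, for illustration only):

```python
# Hypothetical helper: total workers and servers implied by a cluster.conf.
def cluster_size(nworker_groups, nworkers_per_group,
                 nserver_groups, nservers_per_group):
    return (nworker_groups * nworkers_per_group,
            nserver_groups * nservers_per_group)

# The single-machine config above launches one worker and one server.
print(cluster_size(1, 1, 1, 1))  # (1, 1)
```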
+
+List of examples
+====
+* [MLP using MNIST](http://singa.incubator.apache.org/docs/mlp.html)
+ - A simple back-propagation model: the multilayer perceptron.
+* [CNN using CIFAR10](http://singa.incubator.apache.org/docs/cnn.html)
+ - A convolutional neural network example, using more types of layers.
+