Posted to commits@singa.apache.org by wa...@apache.org on 2019/04/24 14:57:38 UTC
svn commit: r1858059 [2/38] - in /incubator/singa/site/trunk: ./ en/
en/_sources/ en/_sources/community/ en/_sources/develop/ en/_sources/docs/
en/_sources/docs/model_zoo/ en/_sources/docs/model_zoo/caffe/
en/_sources/docs/model_zoo/char-rnn/ en/_sourc...
Added: incubator/singa/site/trunk/en/_sources/docs/image_tool.rst.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/image_tool.rst.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/image_tool.rst.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/image_tool.rst.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,23 @@
+.. Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+
+
+Image Tool
+==========
+
+.. automodule:: singa.image_tool
+ :members:
Added: incubator/singa/site/trunk/en/_sources/docs/index.rst.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/index.rst.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/index.rst.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/index.rst.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,42 @@
+.. Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+
+
+Documentation
+=============
+
+.. toctree::
+
+ installation
+ software_stack
+ device
+ tensor
+ layer
+ net
+ initializer
+ loss
+ metric
+ optimizer
+ autograd
+ data
+ image_tool
+ snapshot
+ converter
+ utils
+ model_zoo/index
+ security
+
Added: incubator/singa/site/trunk/en/_sources/docs/initializer.rst.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/initializer.rst.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/initializer.rst.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/initializer.rst.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,30 @@
+.. Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+
+
+Initializer
+===========
+
+Python API
+----------
+
+.. automodule:: singa.initializer
+ :members: uniform, gaussian
+ :member-order: bysource
+
+CPP API
+--------
Added: incubator/singa/site/trunk/en/_sources/docs/install_macos1013.rst.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/install_macos1013.rst.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/install_macos1013.rst.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/install_macos1013.rst.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,153 @@
+.. Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+
+
+Installing SINGA on macOS 10.13
+===============================
+
+Requirements
+------------
+
+* Homebrew is used to install the requirements. Try:
+
+.. code-block:: bash
+
+ brew update
+
+If you don't have homebrew in your system or if you upgraded from a previous operating system, you may see an error message. See FAQ below.
+
+* Install the required software for building SINGA:
+
+.. code-block:: bash
+
+ brew tap homebrew/science
+ brew tap homebrew/python
+
+ brew install openblas
+ brew install protobuf
+ brew install swig
+
+ brew install git
+ brew install cmake
+
+ brew install python
+ brew install opencv
+ brew install glog lmdb
+
+The following packages are needed only if the USE_MODULES option in cmake is used:
+
+.. code-block:: bash
+
+ brew install automake
+ brew install wget
+
+* Prepare the compiler:
+
+To let the compiler (and cmake) know the openblas
+path,
+
+.. code-block:: bash
+
+ export CMAKE_INCLUDE_PATH=/usr/local/opt/openblas/include:$CMAKE_INCLUDE_PATH
+ export CMAKE_LIBRARY_PATH=/usr/local/opt/openblas/lib:$CMAKE_LIBRARY_PATH
+
+To let the runtime know the openblas path,
+
+.. code-block:: bash
+
+ export LD_LIBRARY_PATH=/usr/local/opt/openblas/lib:$LD_LIBRARY_PATH
+
+Add the numpy header path to the compiler flags, for example:
+
+.. code-block:: bash
+
+ export CXXFLAGS="-I /usr/local/lib/python2.7/site-packages/numpy/core/include $CXXFLAGS"
+
+* Get the source code and build it:
+
+.. code-block:: bash
+
+ git clone https://github.com/apache/incubator-singa.git
+
+ cd incubator-singa
+ mkdir build
+ cd build
+
+ cmake ..
+ make
+
+* Optional: create a virtual environment:
+
+.. code-block:: bash
+
+ virtualenv ~/venv
+ source ~/venv/bin/activate
+
+* Install the python module
+
+.. code-block:: bash
+
+ cd python
+ pip install .
+
+If there is no error message from
+
+.. code-block:: bash
+
+ python -c "from singa import tensor"
+
+then SINGA is installed successfully.
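The same import check can be scripted without raising an exception on failure; a small generic helper (a sketch, not part of SINGA):

```python
# Check whether a module can be imported, without actually importing it
# (generic helper, not part of SINGA).
import importlib.util

def module_available(name):
    """Return True if `name` resolves to an importable module."""
    return importlib.util.find_spec(name) is not None

# After a successful install, module_available("singa") should be True.
```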
+
+* Run Jupyter notebook
+
+.. code-block:: bash
+
+ pip install matplotlib
+
+ cd ../../doc/en/docs/notebook
+ jupyter notebook
+
+Video Tutorial
+--------------
+
+See these steps in the following video:
+
+.. |video| image:: https://img.youtube.com/vi/T8xGTH9vCBs/0.jpg
+ :scale: 100%
+ :align: middle
+ :target: https://www.youtube.com/watch?v=T8xGTH9vCBs
+
++---------+
+| |video| |
++---------+
+
+FAQ
+---
+
+* How to install or update homebrew:
+
+.. code-block:: bash
+
+ /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
+
+* There is an error with protobuf.
+
+Try overwriting the links:
+
+.. code-block:: bash
+
+ brew link --overwrite protobuf
Added: incubator/singa/site/trunk/en/_sources/docs/install_win.rst.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/install_win.rst.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/install_win.rst.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/install_win.rst.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,419 @@
+.. Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+
+
+Building SINGA on Windows
+=========================
+
+The process of building SINGA from source on Microsoft Windows has four parts: installing the dependencies, building the SINGA source, (optionally) installing the Python module, and (optionally) running the unit tests.
+
+1. Install Dependencies
+-----------------------
+
+You may create a folder for building the dependencies.
+
+The dependencies are:
+
+* Compiler and IDE
+ * Visual Studio. The community edition is free and can be used to build SINGA. https://www.visualstudio.com/
+* CMake
+ * Can be downloaded from http://cmake.org/
+ * Make sure the path to cmake executable is in the system path, or use full path when calling cmake.
+* SWIG
+ * Can be downloaded from http://swig.org/
+ * Make sure the path to swig executable is in the system path, or use full path when calling swig. Use a recent version such as 3.0.12.
+
+* Protocol Buffers
+ * Download a suitable version such as 2.6.1: https://github.com/google/protobuf/releases/tag/v2.6.1 .
+ * Download both protobuf-2.6.1.zip and protoc-2.6.1-win32.zip .
+ * Extract both of them in dependencies folder. Add the path to protoc executable to the system path, or use full path when calling it.
+ * Open the Visual Studio solution which can be found in vsproject folder.
+ * Change the build settings to Release and x64.
+  * Build the libprotobuf project.
+* Openblas
+ * Download a suitable source version such as 0.2.20 from http://www.openblas.net
+ * Extract the source in the dependencies folder.
+ * If you don't have Perl installed, download a perl environment such as Strawberry Perl (http://strawberryperl.com/)
+ * Build the Visual Studio solution by running this command in the source folder:
+
+ .. code-block:: bash
+
+ cmake -G "Visual Studio 15 2017 Win64"
+
+ * Open the Visual Studio solution and change the build settings to Release and x64.
+ * Build libopenblas project
+
+* Google glog
+ * Download a suitable version such as 0.3.5 from https://github.com/google/glog/releases
+ * Extract the source in the dependencies folder.
+ * Open the Visual Studio solution.
+ * Change the build settings to Release and x64.
+ * Build libglog project
+
+2. Build SINGA source
+---------------------
+
+* Download SINGA source code
+* Compile the protobuf files:
+  * Go to the src/proto folder
+
+.. code-block:: bash
+
+ mkdir python_out
+ protoc.exe *.proto --python_out python_out
+
+* Generate swig interfaces for C++ and Python:
+  Go to src/api
+
+.. code-block:: bash
+
+ swig -python -c++ singa.i
+
+* Generate the Visual Studio solution for SINGA:
+  Go to the SINGA source code root folder
+
+.. code-block:: bash
+
+ mkdir build
+ cd build
+
+* Call cmake and add the paths in your system similar to the following example:
+
+.. code-block:: bash
+
+ cmake -G "Visual Studio 15 2017 Win64" ^
+ -DGLOG_INCLUDE_DIR="D:/WinSinga/dependencies/glog-0.3.5/src/windows" ^
+ -DGLOG_LIBRARIES="D:/WinSinga/dependencies/glog-0.3.5/x64/Release" ^
+ -DCBLAS_INCLUDE_DIR="D:/WinSinga/dependencies/openblas-0.2.20/lapack-netlib/CBLAS/include" ^
+ -DCBLAS_LIBRARIES="D:/WinSinga/dependencies/openblas-0.2.20/lib/RELEASE" ^
+ -DProtobuf_INCLUDE_DIR="D:/WinSinga/dependencies/protobuf-2.6.1/src" ^
+ -DProtobuf_LIBRARIES="D:/WinSinga/dependencies/protobuf-2.6.1/vsprojects/x64/Release" ^
+ -DProtobuf_PROTOC_EXECUTABLE="D:/WinSinga/dependencies/protoc-2.6.1-win32/protoc.exe" ^
+ ..
+
+* Open the generated solution in Visual Studio
+* Change the build settings to Release and x64
+* Add the singa_wrap.cxx file from src/api to the singa_objects project
+* In the singa_objects project, open Additional Include Directories.
+* Add Python include path
+* Add numpy include path
+* Add protobuf include path
+* In the preprocessor definitions of the singa_objects project, add USE_GLOG
+* Build singa_objects project
+
+* In the singa project:
+
+  * Add singa_wrap.obj to Object Libraries
+  * Change the target name to _singa_wrap
+  * Change the target extension to .pyd
+  * Change the configuration type to Dynamic Library (.dll)
+  * Go to Additional Library Directories and add the paths to the python, openblas, protobuf and glog libraries
+  * Go to Additional Dependencies and add libopenblas.lib, libglog.lib and libprotobuf.lib
+
+* Build the singa project
+
+
+3. Install Python module
+------------------------
+
+* Change _singa_wrap.so to _singa_wrap.pyd in build/python/setup.py
+* Copy the files in src/proto/python_out to build/python/singa/proto
+
+* Optionally create and activate a virtual environment:
+
+.. code-block:: bash
+
+ mkdir SingaEnv
+ virtualenv SingaEnv
+ SingaEnv\Scripts\activate
+
+* Go to the build/python folder and run:
+
+.. code-block:: bash
+
+ python setup.py install
+
+* Make _singa_wrap.pyd, libglog.dll and libopenblas.dll available by adding them to the path or by copying them to the singa package folder in the python site-packages.
+
+* Verify that SINGA is installed by running:
+
+.. code-block:: bash
+
+ python -c "from singa import tensor"
+
+A video tutorial for the build process can be found here:
+
+
+.. |video1| image:: https://img.youtube.com/vi/cteER7WeiGk/0.jpg
+ :scale: 100%
+ :align: middle
+ :target: https://www.youtube.com/watch?v=cteER7WeiGk
+
++----------+
+| |video1| |
++----------+
+
+
+4. Run Unit Tests
+-----------------
+
+* In the test folder, generate the Visual Studio solution:
+
+.. code-block:: bash
+
+ cmake -G "Visual Studio 15 2017 Win64"
+
+* Open the generated solution in Visual Studio.
+
+* Change the build settings to Release and x64.
+
+* Build glog project.
+
+* In test_singa project:
+
+ * Add USE_GLOG to the Preprocessor Definitions.
+ * In Additional Include Directories, add path of GLOG_INCLUDE_DIR, CBLAS_INCLUDE_DIR and Protobuf_INCLUDE_DIR which were used in step 2 above. Add also build and build/include folders.
+  * Go to Additional Library Directories and add the paths to the openblas, protobuf and glog libraries. Also add build/src/singa_objects.dir/Release.
+  * Go to Additional Dependencies and add libopenblas.lib, libglog.lib and libprotobuf.lib. Fix the names of the two libraries: gtest.lib and singa_objects.lib.
+
+* Build test_singa project.
+
+* Make libglog.dll and libopenblas.dll available by adding them to the path or by copying them to test/release folder
+
+* The unit tests can be executed
+
+ * From the command line:
+
+ .. code-block:: bash
+
+ test_singa.exe
+
+ * From Visual Studio:
+ * right click on the test_singa project and choose 'Set as StartUp Project'.
+ * from the Debug menu, choose 'Start Without Debugging'
+
+A video tutorial for running the unit tests can be found here:
+
+
+.. |video2| image:: https://img.youtube.com/vi/393gPtzMN1k/0.jpg
+ :scale: 100%
+ :align: middle
+ :target: https://www.youtube.com/watch?v=393gPtzMN1k
+
++----------+
+| |video2| |
++----------+
+
+
+5. Build GPU support with CUDA
+------------------------------
+
+In this section, we will extend the previous steps to enable GPU.
+
+5.1 Install Dependencies
+------------------------
+
+In addition to the dependencies in section 1 above, we will need the following:
+
+* CUDA
+
+ Download a suitable version such as 9.1 from https://developer.nvidia.com/cuda-downloads . Make sure to install the Visual Studio integration module.
+
+* cuDNN
+
+ Download a suitable version such as 7.1 from https://developer.nvidia.com/cudnn
+
+* cnmem:
+
+ * Download the latest version from https://github.com/NVIDIA/cnmem
+ * Build the Visual Studio solution:
+
+ .. code-block:: bash
+
+ cmake -G "Visual Studio 15 2017 Win64"
+
+ * Open the generated solution in Visual Studio.
+ * Change the build settings to Release and x64.
+ * Build the cnmem project.
+
+
+5.2 Build SINGA source
+----------------------
+
+* Call cmake and add the paths in your system similar to the following example:
+
+ .. code-block:: bash
+
+ cmake -G "Visual Studio 15 2017 Win64" ^
+ -DGLOG_INCLUDE_DIR="D:/WinSinga/dependencies/glog-0.3.5/src/windows" ^
+ -DGLOG_LIBRARIES="D:/WinSinga/dependencies/glog-0.3.5/x64/Release" ^
+ -DCBLAS_INCLUDE_DIR="D:/WinSinga/dependencies/openblas-0.2.20/lapack-netlib/CBLAS/include" ^
+ -DCBLAS_LIBRARIES="D:/WinSinga/dependencies/openblas-0.2.20/lib/RELEASE" ^
+ -DProtobuf_INCLUDE_DIR="D:/WinSinga/dependencies/protobuf-2.6.1/src" ^
+      -DProtobuf_LIBRARIES="D:/WinSinga/dependencies/protobuf-2.6.1/vsprojects/x64/Release" ^
+ -DProtobuf_PROTOC_EXECUTABLE="D:/WinSinga/dependencies/protoc-2.6.1-win32/protoc.exe" ^
+ -DCUDNN_INCLUDE_DIR=D:\WinSinga\dependencies\cudnn-9.1-windows10-x64-v7.1\cuda\include ^
+ -DCUDNN_LIBRARIES=D:\WinSinga\dependencies\cudnn-9.1-windows10-x64-v7.1\cuda\lib\x64 ^
+ -DSWIG_DIR=D:\WinSinga\dependencies\swigwin-3.0.12 ^
+ -DSWIG_EXECUTABLE=D:\WinSinga\dependencies\swigwin-3.0.12\swig.exe ^
+ -DUSE_CUDA=YES ^
+ -DCUDNN_VERSION=7 ^
+ ..
+
+
+* Generate swig interfaces for C++ and Python:
+  Go to src/api
+
+ .. code-block:: bash
+
+ swig -python -c++ singa.i
+
+* Open the generated solution in Visual Studio
+
+* Change the build settings to Release and x64
+
+5.2.1 Building singa_objects
+----------------------------
+
+* Add the singa_wrap.cxx file from src/api to the singa_objects project
+* In the singa_objects project, open Additional Include Directories.
+* Add Python include path
+* Add numpy include path
+* Add protobuf include path
+* Add include path for CUDA, cuDNN and cnmem
+* In the preprocessor definitions of the singa_objects project, add USE_GLOG, USE_CUDA and USE_CUDNN. Remove DISABLE_WARNINGS.
+* Build singa_objects project
+
+5.2.2 Building singa-kernel
+---------------------------
+
+* Create a new Visual Studio project of type "CUDA 9.1 Runtime". Give it a name such as singa-kernel.
+* The project comes with an initial file called kernel.cu. Remove this file from the project.
+* Add this file: src/core/tensor/math_kernel.cu
+* In the project settings:
+
+ * Set Platform Toolset to "Visual Studio 2015 (v140)"
+  * Set Configuration Type to "Static Library (.lib)"
+ * In the Include Directories, add build/include.
+
+* Build singa-kernel project
+
+
+5.2.3 Building singa
+--------------------
+
+* In the singa project:
+
+  * Add singa_wrap.obj to Object Libraries
+  * Change the target name to _singa_wrap
+  * Change the target extension to .pyd
+  * Change the configuration type to Dynamic Library (.dll)
+  * Go to Additional Library Directories and add the paths to the python, openblas, protobuf and glog libraries
+  * Also add the library paths for singa-kernel, cnmem, cuda and cudnn
+  * Go to Additional Dependencies and add libopenblas.lib, libglog.lib and libprotobuf.lib
+  * Also add singa-kernel.lib, cnmem.lib, cudnn.lib, cuda.lib, cublas.lib, curand.lib and cudart.lib
+
+* Build the singa project
+
+5.3. Install Python module
+--------------------------
+
+* Change _singa_wrap.so to _singa_wrap.pyd in build/python/setup.py
+* Copy the files in src/proto/python_out to build/python/singa/proto
+
+* Optionally create and activate a virtual environment:
+
+.. code-block:: bash
+
+ mkdir SingaEnv
+ virtualenv SingaEnv
+ SingaEnv\Scripts\activate
+
+* Go to the build/python folder and run:
+
+.. code-block:: bash
+
+ python setup.py install
+
+* Make _singa_wrap.pyd, libglog.dll, libopenblas.dll, cnmem.dll, CUDA Runtime (e.g. cudart64_91.dll) and cuDNN (e.g. cudnn64_7.dll) available by adding them to the path or by copying them to singa package folder in the python site-packages
+
+* Verify that SINGA is installed by running:
+
+.. code-block:: bash
+
+ python -c "from singa import device; dev = device.create_cuda_gpu()"
+
+A video tutorial for this part can be found here:
+
+
+.. |video3| image:: https://img.youtube.com/vi/YasKVjRtuDs/0.jpg
+ :scale: 100%
+ :align: middle
+ :target: https://www.youtube.com/watch?v=YasKVjRtuDs
+
++----------+
+| |video3| |
++----------+
+
+5.4. Run Unit Tests
+-------------------
+
+* In the test folder, generate the Visual Studio solution:
+
+.. code-block:: bash
+
+ cmake -G "Visual Studio 15 2017 Win64"
+
+* Open the generated solution in Visual Studio, or add the project to the singa solution that was created in step 5.2
+
+* Change the build settings to Release and x64.
+
+* Build glog project.
+
+* In test_singa project:
+
+ * Add USE_GLOG; USE_CUDA; USE_CUDNN to the Preprocessor Definitions.
+ * In Additional Include Directories, add path of GLOG_INCLUDE_DIR, CBLAS_INCLUDE_DIR and Protobuf_INCLUDE_DIR which were used in step 5.2 above. Add also build, build/include, CUDA and cuDNN include folders.
+  * Go to Additional Library Directories and add the paths to the openblas, protobuf and glog libraries. Also add build/src/singa_objects.dir/Release and the singa-kernel, cnmem, CUDA and cuDNN library paths.
+  * Go to Additional Dependencies and add libopenblas.lib; libglog.lib; libprotobuf.lib; cnmem.lib; cudnn.lib; cuda.lib; cublas.lib; curand.lib; cudart.lib; singa-kernel.lib. Fix the names of the two libraries: gtest.lib and singa_objects.lib.
+
+
+* Build test_singa project.
+
+* Make libglog.dll, libopenblas.dll, cnmem.dll, cudart64_91.dll and cudnn64_7.dll available by adding them to the path or by copying them to test/release folder
+
+* The unit tests can be executed
+
+ * From the command line:
+
+ .. code-block:: bash
+
+ test_singa.exe
+
+ * From Visual Studio:
+ * right click on the test_singa project and choose 'Set as StartUp Project'.
+ * from the Debug menu, choose 'Start Without Debugging'
+
+A video tutorial for running the unit tests can be found here:
+
+
+.. |video4| image:: https://img.youtube.com/vi/YOjwtrvTPn4/0.jpg
+ :scale: 100%
+ :align: middle
+ :target: https://www.youtube.com/watch?v=YOjwtrvTPn4
+
++----------+
+| |video4| |
++----------+
Added: incubator/singa/site/trunk/en/_sources/docs/installation.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/installation.md.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/installation.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/installation.md.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,381 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+-->
+# Installation
+
+## From Conda
+
+Conda is a package manager for Python, CPP and other packages.
+
+Currently, SINGA has conda packages (Python 2.7 and Python 3.6) for Linux and macOS.
+[Miniconda3](https://conda.io/miniconda.html) is recommended for use with SINGA.
+After installing miniconda, execute one of the following commands to install
+SINGA.
+
+1. CPU only
+
+ conda install -c nusdbsystem singa-cpu
+
+2. GPU with CUDA and cuDNN
+
+ conda install -c nusdbsystem singa-gpu
+
+
+ The CUDA driver (for CUDA >= 9.0) must be installed before executing the above command. SINGA
+ packages for other CUDA versions are also available. The following command
+ lists all the available SINGA packages.
+
+ conda search -c nusdbsystem singa
+
+If there is no error message from
+
+ python -c "from singa import tensor"
+
+then SINGA is installed successfully.
+
+
+## From source
+
+The source files can be downloaded either as a
+[tar.gz file](https://dist.apache.org/repos/dist/dev/incubator/singa/), or as a git repo
+
+ $ git clone https://github.com/apache/incubator-singa.git
+ $ cd incubator-singa/
+
+### Use Conda to build SINGA
+
+conda-build is a build tool that installs the dependent libraries from Anaconda Cloud and
+executes the build scripts. The generated package can be uploaded to Anaconda
+Cloud for others to download and install.
+
+To install conda-build (after installing miniconda)
+
+ conda install conda-build
+
+To build the CPU version of SINGA
+
+ conda build tool/conda/singa/ --python 3.6
+
+The above commands have been tested on Ubuntu 16.04 and macOS.
+Refer to the [Travis-CI page](https://travis-ci.org/apache/incubator-singa) for more information.
+
+
+To build the GPU version of SINGA
+
+ export CUDA=x.y (e.g. 9.0)
+ conda build tool/conda/singa/ --python 3.6
+
+The commands for building on GPU platforms have been tested on Ubuntu 16.04 (cuDNN>=7 and CUDA>=9).
+[Nvidia's Docker image](https://hub.docker.com/r/nvidia/cuda/) provides the building
+environment with cuDNN and CUDA.
+
+The location of the generated package file is shown on the screen.
+Refer to [conda install](https://conda.io/docs/commands/conda-install.html) for
+the instructions of installing the package from the local file.
+
+
+### Use native tools to build SINGA on Ubuntu
+
+The following libraries are required to compile and run SINGA.
+Refer to SINGA [Dockerfiles](https://github.com/apache/incubator-singa/blob/master/tool/docker/)
+for the instructions of installing them on Ubuntu 16.04.
+
+* cmake (>=2.8)
+* gcc (>=4.8.1)
+* google protobuf (>=2.5)
+* blas (tested with openblas >=0.2.10)
+* swig(>=3.0.10) for compiling PySINGA
+* numpy(>=1.11.0) for compiling PySINGA
+
+
+1. create a `build` folder inside incubator-singa and go into that folder
+2. run `cmake [options] ..`
+ by default all options are OFF except `USE_PYTHON`
+
+ * `USE_MODULES=ON`, used if protobuf and blas are not installed beforehand
+ * `USE_CUDA=ON`, used if CUDA and cuDNN are available
+ * `USE_PYTHON3=ON`, used for compiling with Python 3 support. (The default is Python 2)
+ * `USE_OPENCL=ON`, used for compiling with OpenCL support
+ * `USE_MKLDNN=ON`, used for compiling with Intel MKL-dnn support
+ * `PACKAGE=ON`, used for building the Debian package
+ * `ENABLE_TEST=ON`, used for compiling unit test cases
+
+3. compile the code, `make`
+4. go to the python folder
+5. run `pip install .` or `pip install -e .` The second command creates symlinks instead of copying files into the python site-packages folder.
+
+Steps 4 and 5 install PySINGA and are needed only when USE_PYTHON=ON.
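After an editable install (`pip install -e .`), it can be handy to confirm which copy of a package Python actually imports. A small sketch, using `json` as a stand-in for any installed module:

```python
# Locate the directory a module is imported from; after `pip install -e .`
# this points into the source tree rather than site-packages.
# `json` is a stand-in here for any installed module.
import os
import json

module_dir = os.path.dirname(json.__file__)
print(module_dir)
```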
+
+After compiling SINGA with ENABLE_TEST=ON, you can run the unit tests by
+
+ $ ./bin/test_singa
+
+You will see all the test cases with their results. If SINGA passes all
+tests, then you have successfully installed SINGA.
+
+
+### Compile SINGA on Windows
+
+Instructions for building on Windows with Python support can be found [here](install_win.html).
+
+### More details about the compilation options
+
+#### USE_MODULES
+
+If protobuf and openblas are not installed, you can compile SINGA together with them
+
+ # In the SINGA root folder
+ $ mkdir build
+ $ cd build
+ $ cmake -DUSE_MODULES=ON ..
+ $ make
+
+cmake will download OpenBLAS and Protobuf (2.6.1) and compile them together
+with SINGA.
+
+You can use `ccmake ..` to configure the compilation options.
+If some dependent libraries are not in the system default paths, you need to export
+the following environment variables
+
+ export CMAKE_INCLUDE_PATH=<path to the header file folder>
+ export CMAKE_LIBRARY_PATH=<path to the lib file folder>
+
+#### USE_PYTHON
+
+Similar to compiling the CPP code, PySINGA is compiled by
+
+ $ cmake -DUSE_PYTHON=ON ..
+ $ make
+ $ cd python
+ $ pip install .
+
+
+#### USE_CUDA
+
+Users are encouraged to install CUDA and
+[cuDNN](https://developer.nvidia.com/cudnn) for running SINGA on GPUs to
+get better performance.
+
+SINGA has been tested with CUDA 9 and cuDNN 7. If cuDNN is
+installed into a non-system folder, e.g. /home/bob/local/cudnn/, the following
+commands should be executed so that cmake and the runtime can find it
+
+ $ export CMAKE_INCLUDE_PATH=/home/bob/local/cudnn/include:$CMAKE_INCLUDE_PATH
+ $ export CMAKE_LIBRARY_PATH=/home/bob/local/cudnn/lib64:$CMAKE_LIBRARY_PATH
+ $ export LD_LIBRARY_PATH=/home/bob/local/cudnn/lib64:$LD_LIBRARY_PATH
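The three exports above all follow the same pattern: prepend a directory to a path-like variable. A generic Python sketch of that pattern (not SINGA-specific):

```python
# Prepend a directory to a path-like environment variable,
# mirroring the shell exports above (generic sketch).
import os

def prepend_path(var, directory):
    """Prepend `directory` to the environment variable `var`."""
    old = os.environ.get(var, "")
    os.environ[var] = directory if not old else directory + os.pathsep + old
    return os.environ[var]

# e.g. prepend_path("LD_LIBRARY_PATH", "/home/bob/local/cudnn/lib64")
```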
+
+The cmake options for CUDA and cuDNN should be switched on
+
+ # Dependent libs are installed already
+ $ cmake -DUSE_CUDA=ON ..
+ $ make
+
+#### USE_OPENCL
+
+SINGA uses opencl-headers and viennacl (version 1.7.1 or newer) for OpenCL support, which
+can be installed via
+
+ # On Ubuntu 16.04
+ $ sudo apt-get install opencl-headers libviennacl-dev
+ # On Fedora
+ $ sudo yum install opencl-headers viennacl
+
+Additionally, you will need the OpenCL Installable Client Driver (ICD) for the platforms that you want to run OpenCL on.
+
+* For AMD and nVidia GPUs, the driver package should also install the correct OpenCL ICD.
+* For Intel CPUs and/or GPUs, get the driver from the [Intel website](https://software.intel.com/en-us/articles/opencl-drivers). Note that the drivers provided on that website only support recent CPUs and Iris GPUs.
+* For older Intel CPUs, you can use the `beignet-opencl-icd` package.
+
+Note that running OpenCL on CPUs is not currently recommended because it is slow.
+Memory transfers take on the order of whole seconds (thousands of milliseconds on CPUs, compared to single milliseconds on GPUs).
+
+More information on setting up a working OpenCL environment may be found [here](https://wiki.tiker.net/OpenCLHowTo).
+
+If the package version of ViennaCL is not at least 1.7.1, you will need to build it from source:
+
+Clone [the repository from here](https://github.com/viennacl/viennacl-dev), checkout the `release-1.7.1` tag and build it.
+Remember to add its directory to `PATH` and the built libraries to `LD_LIBRARY_PATH`.
+
+To build SINGA with OpenCL support (tested on SINGA 1.1):
+
+ $ cmake -DUSE_OPENCL=ON ..
+ $ make
+
+#### USE_MKLDNN
+
+Users can enable MKL-DNN to enhance the performance of CPU computation.
+
+An installation guide for MKL-DNN can be found [here](https://github.com/intel/mkl-dnn#installation).
+
+SINGA has been tested with MKL-DNN v0.17.2.
+
+To build SINGA with MKL-DNN support:
+
+ # Dependent libs are installed already
+ $ cmake -DUSE_MKLDNN=ON ..
+ $ make
+
+
+#### PACKAGE
+
+This setting is used to build the Debian package. Set PACKAGE=ON and build the package with the make command like this:
+
+ $ cmake -DPACKAGE=ON ..
+ $ make package
+
+
+## FAQ
+
+* Q: Error from 'import singa' using PySINGA installed from wheel.
+
+ A: Please check the detailed error from `python -c "from singa import _singa_wrap"`. Sometimes it is
+ caused by the dependent libraries, e.g., multiple versions of protobuf, missing cuDNN, or a numpy version mismatch. The following
+ steps show the solutions for different cases
+ 1. Check the cuDNN, CUDA and gcc versions; cuDNN 5, CUDA 7.5 and gcc 4.8/4.9 are preferred. If gcc is 5.0, then downgrade it.
+ If cuDNN is missing or does not match the wheel version, you can download the correct version of cuDNN into ~/local/cudnn/ and
+
+ $ echo "export LD_LIBRARY_PATH=/home/<yourname>/local/cudnn/lib64:$LD_LIBRARY_PATH" >> ~/.bashrc
+
+ 2. If it is the problem related to protobuf, then download the newest whl files which have [compiled protobuf and openblas into the whl](https://issues.apache.org/jira/browse/SINGA-255) file of PySINGA.
+ Or you can install protobuf from source into a local folder, say ~/local/;
+ Decompress the tar file, and then
+
+ $ ./configure --prefix=/home/<yourname>/local
+ $ make && make install
+ $ echo "export LD_LIBRARY_PATH=/home/<yourname>/local/lib:$LD_LIBRARY_PATH" >> ~/.bashrc
+ $ source ~/.bashrc
+
+ 3. If it cannot find other libs, including python, then create a virtual env using pip or conda;
+
+ 4. If it is not caused by the above reasons, go to the folder of `_singa_wrap.so`,
+
+ $ python
+ >>> import importlib
+ >>> importlib.import_module('_singa_wrap')
+
+ Check the error message. For example, if the numpy version mismatches, the error message would be,
+
+ RuntimeError: module compiled against API version 0xb but this version of numpy is 0xa
+
+ Then you need to upgrade numpy.
+
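For case 3 above, an isolated environment can also be created with Python 3's standard-library `venv` module (a minimal sketch; `conda create` or `virtualenv` work equally well, and the target path below is just an example):

```python
# Minimal sketch: create an isolated environment with the stdlib venv module,
# so PySINGA's dependencies do not clash with system-wide packages.
import os
import tempfile
import venv

target = os.path.join(tempfile.mkdtemp(), "singa-env")
venv.create(target, with_pip=False)  # with_pip=True also bootstraps pip

# The environment directory now exists with its own configuration.
print(os.path.isdir(target))
```

Activate it with `source <target>/bin/activate` before installing PySINGA, so that `pip` and `python` resolve inside the environment.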
+
+* Q: Error from running `cmake ..`, which cannot find the dependent libraries.
+
+ A: If you haven't installed the libraries, install them. If you installed
+ the libraries in a folder that is outside of the system folder, e.g. /usr/local,
+ you need to export the following variables
+
+ $ export CMAKE_INCLUDE_PATH=<path to your header file folder>
+ $ export CMAKE_LIBRARY_PATH=<path to your lib file folder>
+
+
+* Q: Error from `make`, e.g. at the linking phase
+
+ A: If your libraries are in folders other than the system default paths, you need
+ to export the following variables
+
+ $ export LIBRARY_PATH=<path to your lib file folder>
+ $ export LD_LIBRARY_PATH=<path to your lib file folder>
+
+
+* Q: Error from header files, e.g. 'cblas.h: no such file or directory'
+
+ A: You need to include the folder of the cblas.h into CPLUS_INCLUDE_PATH,
+ e.g.,
+
+ $ export CPLUS_INCLUDE_PATH=/opt/OpenBLAS/include:$CPLUS_INCLUDE_PATH
+
+* Q: While compiling SINGA, I get the error `SSE2 instruction set not enabled`
+
+ A: You can try the following command:
+
+ $ make CFLAGS='-msse2' CXXFLAGS='-msse2'
+
+* Q:I get `ImportError: cannot import name enum_type_wrapper` from google.protobuf.internal when I try to import .py files.
+
+ A: You need to install the python binding of protobuf, which could be installed via
+
+ $ sudo apt-get install python-protobuf
+
+ or from source
+
+ $ cd /PROTOBUF/SOURCE/FOLDER
+ $ cd python
+ $ python setup.py build
+ $ python setup.py install
+
+* Q: When I build OpenBLAS from source, I am told that I need a Fortran compiler.
+
+ A: You can compile OpenBLAS by
+
+ $ make ONLY_CBLAS=1
+
+ or install it using
+
+ $ sudo apt-get install libopenblas-dev
+
+* Q: When I build protocol buffer, it reports that GLIBCXX_3.4.20 is not found in /usr/lib64/libstdc++.so.6.
+
+ A: This means the linker found libstdc++.so.6 but that library
+ belongs to an older version of GCC than was used to compile and link the
+ program. The program depends on code defined in
+ the newer libstdc++ that belongs to the newer version of GCC, so the linker
+ must be told how to find the newer libstdc++ shared library.
+ The simplest way to fix this is to find the correct libstdc++ and export it to
+ LD_LIBRARY_PATH. For example, if GLIBCXX_3.4.20 is listed in the output of the
+ following command,
+
+ $ strings /usr/local/lib64/libstdc++.so.6 | grep GLIBCXX
+
+ then you just set your environment variable as
+
+ $ export LD_LIBRARY_PATH=/usr/local/lib64:$LD_LIBRARY_PATH
+
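The `strings | grep` check above can also be scripted, e.g. with a small hypothetical Python helper that scans a binary for the symbol-version strings it contains:

```python
# Hypothetical helper mirroring the `strings | grep` check above: scan a
# shared library (or any binary file) for the GLIBCXX symbol-version
# strings embedded in it.
import re

def glibcxx_versions(path):
    with open(path, "rb") as f:
        data = f.read()
    found = re.findall(rb"GLIBCXX_[0-9]+(?:\.[0-9]+)*", data)
    return sorted(set(v.decode() for v in found))

# Usage on a real system (path is an example):
#   glibcxx_versions("/usr/local/lib64/libstdc++.so.6")
# If "GLIBCXX_3.4.20" appears in the result, that libstdc++ is new enough.
```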
+* Q: When I build glog, it reports "src/logging_unittest.cc:83:20: error: 'gflags' is not a namespace-name"
+
+ A: It may be that you have installed gflags with a different namespace such as "google", so glog cannot find the 'gflags' namespace.
+ Since gflags is not necessary for building glog, you can change the configure.ac file to ignore gflags.
+
+ 1. cd to glog src directory
+ 2. change line 125 of configure.ac to "AC_CHECK_LIB(gflags, main, ac_cv_have_libgflags=0, ac_cv_have_libgflags=0)"
+ 3. autoreconf
+
+ After this, you can build glog again.
+
+* Q: When using a virtual environment, every time I run pip install, it reinstalls numpy. However, that numpy is not used when I `import numpy`
+
+ A: It could be caused by `PYTHONPATH`, which should be set to empty when you are using a virtual environment to avoid conflicts with the path of
+ the virtual environment.
+
+* Q: When compiling PySINGA from source, there is a compilation error because <numpy/arrayobject.h> is missing
+
+ A: Please install numpy and export the path of the numpy header files as
+
+ $ export CPLUS_INCLUDE_PATH=`python -c "import numpy; print numpy.get_include()"`:$CPLUS_INCLUDE_PATH
+
+* Q: When I run PySINGA in Mac OS X, I got the error "Fatal Python error: PyThreadState_Get: no current thread Abort trap: 6"
+
+ A: This error typically happens when you have multiple versions of Python on your system and you installed SINGA via pip (this problem is resolved for installation via conda),
+ e.g., the one that comes with the OS and the one installed by Homebrew. The Python linked by PySINGA must be the same as the Python interpreter.
+ You can check your interpreter by `which python` and check the Python linked by PySINGA via `otool -L <path to _singa_wrap.so>`.
+ To fix this error, compile SINGA with the correct version of Python.
+ In particular, if you build PySINGA from source, you need to specify the paths when invoking [cmake](http://stackoverflow.com/questions/15291500/i-have-2-versions-of-python-installed-but-cmake-is-using-older-version-how-do)
+
+ $ cmake -DPYTHON_LIBRARY=`python-config --prefix`/lib/libpython2.7.dylib -DPYTHON_INCLUDE_DIR=`python-config --prefix`/include/python2.7/ ..
+
+ If you installed PySINGA from binary packages, e.g. debian or wheel, then you need to change the python interpreter, e.g., reset $PATH to put the correct path of Python at the front.
Added: incubator/singa/site/trunk/en/_sources/docs/layer.rst.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/layer.rst.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/layer.rst.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/layer.rst.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,32 @@
+.. Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+
+
+Layer
+======
+
+Python API
+-----------
+.. automodule:: singa.layer
+ :members:
+ :member-order: bysource
+ :show-inheritance:
+ :undoc-members:
+
+
+CPP API
+--------
Added: incubator/singa/site/trunk/en/_sources/docs/loss.rst.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/loss.rst.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/loss.rst.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/loss.rst.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,25 @@
+.. Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+
+
+Loss
+=========
+
+
+.. automodule:: singa.loss
+ :members:
+ :show-inheritance:
Added: incubator/singa/site/trunk/en/_sources/docs/metric.rst.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/metric.rst.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/metric.rst.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/metric.rst.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,26 @@
+.. Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+
+
+Metric
+=========
+
+
+.. automodule:: singa.metric
+ :members:
+ :show-inheritance:
+ :member-order: bysource
Added: incubator/singa/site/trunk/en/_sources/docs/model_zoo/caffe/README.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/model_zoo/caffe/README.md.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/model_zoo/caffe/README.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/model_zoo/caffe/README.md.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,50 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+-->
+# Use parameters pre-trained from Caffe in SINGA
+
+In this example, we use SINGA to load the VGG parameters trained by Caffe to do image classification.
+
+## Run this example
+You can run this example by simply executing `run.sh vgg16` or `run.sh vgg19`.
+The script does the following work.
+
+### Obtain the Caffe model
+* Download caffe model prototxt and parameter binary file.
+* Currently we only support the latest caffe format; if your model is in a
+ previous version of caffe, please update it to the current format (this is
+ supported by caffe).
+* After updating, we can obtain two files, i.e., the prototxt and parameter
+ binary file.
+
+### Prepare test images
+A few sample images are downloaded into the `test` folder.
+
+### Predict
+The `predict.py` script creates the VGG model and reads the parameters,
+
+ usage: predict.py [-h] model_txt model_bin imgclass
+
+where `imgclass` refers to the synsets of imagenet dataset for vgg models.
+You can start the prediction program by executing the following command:
+
+ python predict.py vgg16.prototxt vgg16.caffemodel synset_words.txt
+
+Then you type in the image path, and the program outputs the top-5 labels.
+
+More Caffe models would be tested soon.
Added: incubator/singa/site/trunk/en/_sources/docs/model_zoo/char-rnn/README.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/model_zoo/char-rnn/README.md.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/model_zoo/char-rnn/README.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/model_zoo/char-rnn/README.md.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,50 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+-->
+# Train Char-RNN over plain text
+
+Recurrent neural networks (RNN) are widely used for modelling sequential data,
+e.g., natural language sentences. This example describes how to implement an RNN
+application (or model) using SINGA's RNN layers.
+We will use the [char-rnn](https://github.com/karpathy/char-rnn) model as an
+example, which trains over sentences or
+source code, with each character as an input unit. Particularly, we will train
+an RNN using GRU over Linux kernel source code. After training, we expect to
+generate meaningful code from the model.
+
+
+## Instructions
+
+* Compile and install SINGA. Currently the RNN implementation depends on cuDNN version >= 5.05.
+
+* Prepare the dataset. Download the [kernel source code](http://cs.stanford.edu/people/karpathy/char-rnn/).
+Other plain text files can also be used.
+
+* Start the training,
+
+ python train.py linux_input.txt
+
+ Some hyper-parameters could be set through command line,
+
+ python train.py -h
+
+* Sample characters from the model by providing the number of characters to sample and the seed string.
+
+ python sample.py 'model.bin' 100 --seed '#include <std'
+
+ Please replace 'model.bin' with the path to one of the checkpoint files.
Added: incubator/singa/site/trunk/en/_sources/docs/model_zoo/cifar10/README.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/model_zoo/cifar10/README.md.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/model_zoo/cifar10/README.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/model_zoo/cifar10/README.md.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,94 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+-->
+# Train CNN over Cifar-10
+
+
+Convolutional neural networks (CNN) are a type of feed-forward artificial neural
+network widely used for image and video classification. In this example, we
+will train four deep CNN models to do image classification on the CIFAR-10 dataset,
+
+1. [AlexNet](https://code.google.com/p/cuda-convnet/source/browse/trunk/example-layers/layers-18pct.cfg), the best validation accuracy (without data augmentation) we achieved was about 82%.
+
+2. [VGGNet](http://torch.ch/blog/2015/07/30/cifar.html), the best validation accuracy (without data augmentation) we achieved was about 89%.
+3. [ResNet](https://github.com/facebook/fb.resnet.torch), the best validation accuracy (without data augmentation) we achieved was about 83%.
+4. [Alexnet from Caffe](https://github.com/BVLC/caffe/tree/master/examples/cifar10), SINGA is able to convert models from Caffe seamlessly.
+
+
+## Instructions
+
+
+### SINGA installation
+
+Users can compile and install SINGA from source or install the Python version.
+The code can run on both CPU and GPU. For GPU training, CUDA and CUDNN (V4 or V5)
+are required. Please refer to the installation page for detailed instructions.
+
+### Data preparation
+
+The binary Cifar-10 dataset could be downloaded by
+
+ python download_data.py bin
+
+The Python version could be downloaded by
+
+ python download_data.py py
+
+### Training
+
+There are four training programs
+
+1. train.py. The following command would train the VGG model using the python
+version of the Cifar-10 dataset in 'cifar-10-batches-py' folder.
+
+ python train.py vgg cifar-10-batches-py
+
+ To train other models, please replace 'vgg' with 'alexnet', 'resnet' or 'caffe',
+ where 'caffe' refers to the alexnet model converted from Caffe. By default
+ the training would run on a CudaGPU device; to run it on CppCPU, add an additional
+ argument
+
+ python train.py vgg cifar-10-batches-py --use_cpu
+
+2. alexnet.cc. It trains the AlexNet model using the CPP APIs on a CudaGPU,
+
+ ./run.sh
+
+3. alexnet-parallel.cc. It trains the AlexNet model using the CPP APIs on two CudaGPU devices.
+The two devices run synchronously to compute the gradients of the model parameters, which are
+averaged on the host CPU device and then applied to update the parameters.
+
+ ./run-parallel.sh
+
+4. vgg-parallel.cc. It trains the VGG model using the CPP APIs on two CudaGPU devices similar to alexnet-parallel.cc.
+
+### Prediction
+
+predict.py includes the prediction function
+
+ def predict(net, images, dev, topk=5)
+
+The net is created by loading the previously trained model; images consist of
+a numpy array of images (one row per image); dev is the training device, e.g.,
+a CudaGPU device or the host CppCPU device. It returns the topk labels for each instance.
+
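The top-k selection inside `predict` can be illustrated with plain NumPy (a hypothetical sketch, not SINGA's actual implementation; `probs` stands in for the network's per-class probability output):

```python
import numpy as np

def topk_labels(probs, topk=5):
    """Return the indices of the topk largest probabilities per row."""
    # Negate so that argsort's ascending order yields descending probability.
    return np.argsort(-probs, axis=1)[:, :topk]

# One image, five classes; the three most probable labels are 1, 2 and 3.
probs = np.array([[0.10, 0.50, 0.20, 0.15, 0.05]])
print(topk_labels(probs, topk=3))  # [[1 2 3]]
```
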
+The predict.py file's main function provides an example of using the pre-trained alexnet model to do prediction for new images.
+The 'model.bin' file generated by the training program should be placed in the cifar10 folder to run
+
+ python predict.py
Added: incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/caffe/README.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/caffe/README.md.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/caffe/README.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/caffe/README.md.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,50 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+-->
+# Use parameters pre-trained from Caffe in SINGA
+
+In this example, we use SINGA to load the VGG parameters trained by Caffe to do image classification.
+
+## Run this example
+You can run this example by simply executing `run.sh vgg16` or `run.sh vgg19`.
+The script does the following work.
+
+### Obtain the Caffe model
+* Download caffe model prototxt and parameter binary file.
+* Currently we only support the latest caffe format; if your model is in a
+ previous version of caffe, please update it to the current format (this is
+ supported by caffe).
+* After updating, we can obtain two files, i.e., the prototxt and parameter
+ binary file.
+
+### Prepare test images
+A few sample images are downloaded into the `test` folder.
+
+### Predict
+The `predict.py` script creates the VGG model and reads the parameters,
+
+ usage: predict.py [-h] model_txt model_bin imgclass
+
+where `imgclass` refers to the synsets of imagenet dataset for vgg models.
+You can start the prediction program by executing the following command:
+
+ python predict.py vgg16.prototxt vgg16.caffemodel synset_words.txt
+
+Then you type in the image path, and the program outputs the top-5 labels.
+
+More Caffe models would be tested soon.
Added: incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/char-rnn/README.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/char-rnn/README.md.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/char-rnn/README.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/char-rnn/README.md.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,50 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+-->
+# Train Char-RNN over plain text
+
+Recurrent neural networks (RNN) are widely used for modelling sequential data,
+e.g., natural language sentences. This example describes how to implement an RNN
+application (or model) using SINGA's RNN layers.
+We will use the [char-rnn](https://github.com/karpathy/char-rnn) model as an
+example, which trains over sentences or
+source code, with each character as an input unit. Particularly, we will train
+an RNN using GRU over Linux kernel source code. After training, we expect to
+generate meaningful code from the model.
+
+
+## Instructions
+
+* Compile and install SINGA. Currently the RNN implementation depends on cuDNN version >= 5.05.
+
+* Prepare the dataset. Download the [kernel source code](http://cs.stanford.edu/people/karpathy/char-rnn/).
+Other plain text files can also be used.
+
+* Start the training,
+
+ python train.py linux_input.txt
+
+ Some hyper-parameters could be set through command line,
+
+ python train.py -h
+
+* Sample characters from the model by providing the number of characters to sample and the seed string.
+
+ python sample.py 'model.bin' 100 --seed '#include <std'
+
+ Please replace 'model.bin' with the path to one of the checkpoint files.
Added: incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/cifar10/README.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/cifar10/README.md.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/cifar10/README.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/cifar10/README.md.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,94 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+-->
+# Train CNN over Cifar-10
+
+
+Convolutional neural networks (CNN) are a type of feed-forward artificial neural
+network widely used for image and video classification. In this example, we
+will train four deep CNN models to do image classification on the CIFAR-10 dataset,
+
+1. [AlexNet](https://code.google.com/p/cuda-convnet/source/browse/trunk/example-layers/layers-18pct.cfg), the best validation accuracy (without data augmentation) we achieved was about 82%.
+
+2. [VGGNet](http://torch.ch/blog/2015/07/30/cifar.html), the best validation accuracy (without data augmentation) we achieved was about 89%.
+3. [ResNet](https://github.com/facebook/fb.resnet.torch), the best validation accuracy (without data augmentation) we achieved was about 83%.
+4. [Alexnet from Caffe](https://github.com/BVLC/caffe/tree/master/examples/cifar10), SINGA is able to convert models from Caffe seamlessly.
+
+
+## Instructions
+
+
+### SINGA installation
+
+Users can compile and install SINGA from source or install the Python version.
+The code can run on both CPU and GPU. For GPU training, CUDA and CUDNN (V4 or V5)
+are required. Please refer to the installation page for detailed instructions.
+
+### Data preparation
+
+The binary Cifar-10 dataset could be downloaded by
+
+ python download_data.py bin
+
+The Python version could be downloaded by
+
+ python download_data.py py
+
+### Training
+
+There are four training programs
+
+1. train.py. The following command would train the VGG model using the python
+version of the Cifar-10 dataset in 'cifar-10-batches-py' folder.
+
+ python train.py vgg cifar-10-batches-py
+
+ To train other models, please replace 'vgg' with 'alexnet', 'resnet' or 'caffe',
+ where 'caffe' refers to the alexnet model converted from Caffe. By default
+ the training would run on a CudaGPU device; to run it on CppCPU, add an additional
+ argument
+
+ python train.py vgg cifar-10-batches-py --use_cpu
+
+2. alexnet.cc. It trains the AlexNet model using the CPP APIs on a CudaGPU,
+
+ ./run.sh
+
+3. alexnet-parallel.cc. It trains the AlexNet model using the CPP APIs on two CudaGPU devices.
+The two devices run synchronously to compute the gradients of the model parameters, which are
+averaged on the host CPU device and then applied to update the parameters.
+
+ ./run-parallel.sh
+
+4. vgg-parallel.cc. It trains the VGG model using the CPP APIs on two CudaGPU devices similar to alexnet-parallel.cc.
+
+### Prediction
+
+predict.py includes the prediction function
+
+ def predict(net, images, dev, topk=5)
+
+The net is created by loading the previously trained model; images consist of
+a numpy array of images (one row per image); dev is the training device, e.g.,
+a CudaGPU device or the host CppCPU device. It returns the topk labels for each instance.
+
+The predict.py file's main function provides an example of using the pre-trained alexnet model to do prediction for new images.
+The 'model.bin' file generated by the training program should be placed in the cifar10 folder to run
+
+ python predict.py
Added: incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/alexnet/README.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/alexnet/README.md.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/alexnet/README.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/alexnet/README.md.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,76 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+-->
+# Train AlexNet over ImageNet
+
+Convolutional neural network (CNN) is a type of feed-forward neural
+network widely used for image and video classification. In this example, we will
+use a [deep CNN model](http://papers.nips.cc/paper/4824-imagenet-classification-with-deep-convolutional-neural-networks)
+to do image classification against the ImageNet dataset.
+
+## Instructions
+
+### Compile SINGA
+
+Please compile SINGA with CUDA, CUDNN and OpenCV. You can manually turn on the
+options in CMakeLists.txt or run `ccmake ..` in build/ folder.
+
+We have tested CUDNN V4 and V5 (V5 requires CUDA 7.5).
+
+### Data download
+* Please refer to step1-3 on [Instructions to create ImageNet 2012 data](https://github.com/amd/OpenCL-caffe/wiki/Instructions-to-create-ImageNet-2012-data)
+ to download and decompress the data.
+* You can download the training and validation list by
+ [get_ilsvrc_aux.sh](https://github.com/BVLC/caffe/blob/master/data/ilsvrc12/get_ilsvrc_aux.sh)
+ or from [Imagenet](http://www.image-net.org/download-images).
+
+### Data preprocessing
+* Assuming you have downloaded the data and the list,
+ we now transform the data into binary files. You can run:
+
+ sh create_data.sh
+
+  The script generates a test file (`test.bin`), a mean file (`mean.bin`) and
+  several training files (`trainX.bin`) in the specified output folder.
+* You can also change the parameters in `create_data.sh`:
+  + `-trainlist <file>`: the file listing the training images;
+  + `-trainfolder <folder>`: the folder of training images;
+  + `-testlist <file>`: the file listing the test images;
+  + `-testfolder <folder>`: the folder of test images;
+  + `-outdata <folder>`: the folder to save the output files, including the mean, training and test files.
+    The script will generate these files in the specified folder;
+  + `-filesize <int>`: the number of training images stored in each binary file.
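The `-filesize` option determines how the training set is split into `trainX.bin` shards. As a quick illustration, the number of shards can be estimated as `ceil(ntrain / filesize)`; a small sketch (the packing rule is our assumption based on the description above):

```python
import math

def num_train_shards(ntrain: int, filesize: int) -> int:
    """Estimate how many trainX.bin files the preprocessing step emits,
    assuming it packs `filesize` images per binary file."""
    return math.ceil(ntrain / filesize)

# e.g. ImageNet 2012: ~1.28M training images, 1280 images per shard
print(num_train_shards(1281167, 1280))  # -> 1001
```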
+
+### Training
+* After preparing the data, you can run the following command to train the AlexNet model.
+
+ sh run.sh
+
+* You may change the parameters in `run.sh`:
+  + `-epoch <int>`: the number of epochs to train, default 90;
+  + `-lr <float>`: the base learning rate; the learning rate is decayed every 20 epochs,
+    more specifically, `lr = lr * exp(0.1 * (epoch / 20))`;
+  + `-batchsize <int>`: the batch size; adjust it according to your device memory;
+  + `-filesize <int>`: the number of training images stored in each binary file; it must be the
+    same as the `filesize` used in data preprocessing;
+  + `-ntrain <int>`: the number of training images;
+  + `-ntest <int>`: the number of test images;
+  + `-data <folder>`: the folder storing the binary files, i.e., the output
+    folder of the data preprocessing step;
+  + `-pfreq <int>`: the frequency (in batches) of printing the current model status (loss and accuracy);
+  + `-nthreads <int>`: the number of threads used to load data for the model.
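The `-lr` decay rule can be sketched as a small helper. Note that the formula in the text, taken literally, would grow the rate, so the negative sign in this sketch is our assumption, chosen to match the stated decrease every 20 epochs:

```python
import math

def learning_rate(base_lr: float, epoch: int) -> float:
    """Hypothetical per-epoch learning rate: decay by a factor of
    exp(-0.1) for every 20 completed epochs (sign assumed; see text)."""
    return base_lr * math.exp(-0.1 * (epoch // 20))

# trace the schedule over a 90-epoch run with base lr 0.01
for epoch in (0, 19, 20, 40, 89):
    print(epoch, round(learning_rate(0.01, epoch), 6))
```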
Added: incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/densenet/README.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/densenet/README.md.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/densenet/README.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/densenet/README.md.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,68 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+-->
+---
+name: DenseNet models on ImageNet
+SINGA version: 1.1.1
+SINGA commit:
+license: https://github.com/pytorch/vision/blob/master/torchvision/models/densenet.py
+---
+
+# Image Classification using DenseNet
+
+
+In this example, we convert a DenseNet model from [PyTorch](https://github.com/pytorch/vision/blob/master/torchvision/models/densenet.py)
+to SINGA for image classification.
+
+## Instructions
+
+* Download one parameter checkpoint file (see below) and the synset word file of ImageNet into this folder, e.g.,
+
+ $ wget https://s3-ap-southeast-1.amazonaws.com/dlfile/densenet/densenet-121.tar.gz
+ $ wget https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/synset_words.txt
+ $ tar xvf densenet-121.tar.gz
+
+* Usage
+
+ $ python serve.py -h
+
+* Example
+
+ # use cpu
+ $ python serve.py --use_cpu --parameter_file densenet-121.pickle --depth 121 &
+ # use gpu
+ $ python serve.py --parameter_file densenet-121.pickle --depth 121 &
+
+ The parameter files for the following model and depth configuration pairs are provided:
+ [121](https://s3-ap-southeast-1.amazonaws.com/dlfile/densenet/densenet-121.tar.gz), [169](https://s3-ap-southeast-1.amazonaws.com/dlfile/densenet/densenet-169.tar.gz), [201](https://s3-ap-southeast-1.amazonaws.com/dlfile/densenet/densenet-201.tar.gz), [161](https://s3-ap-southeast-1.amazonaws.com/dlfile/densenet/densenet-161.tar.gz)
+
+* Submit images for classification
+
+ $ curl -i -F image=@image1.jpg http://localhost:9999/api
+ $ curl -i -F image=@image2.jpg http://localhost:9999/api
+ $ curl -i -F image=@image3.jpg http://localhost:9999/api
+
+image1.jpg, image2.jpg and image3.jpg should be downloaded before executing the above commands.
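The curl calls above send the image as a multipart/form-data field named `image`. If you prefer plain Python, an equivalent request body can be built with the standard library alone; a minimal sketch (the endpoint URL and the field name come from the curl examples, everything else is illustrative):

```python
import uuid

def build_multipart(field: str, filename: str, payload: bytes):
    """Build a multipart/form-data body equivalent to `curl -F field=@file`,
    returning (content_type_header_value, body_bytes)."""
    boundary = uuid.uuid4().hex
    head = (
        f'--{boundary}\r\n'
        f'Content-Disposition: form-data; name="{field}"; filename="{filename}"\r\n'
        f'Content-Type: application/octet-stream\r\n\r\n'
    ).encode()
    tail = f'\r\n--{boundary}--\r\n'.encode()
    return f'multipart/form-data; boundary={boundary}', head + payload + tail

# To POST it (the serve.py server must be running):
# import urllib.request
# ctype, body = build_multipart('image', 'image1.jpg',
#                               open('image1.jpg', 'rb').read())
# req = urllib.request.Request('http://localhost:9999/api', data=body,
#                              headers={'Content-Type': ctype})
# print(urllib.request.urlopen(req).read())
```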
+
+## Details
+
+The parameter files were converted from PyTorch via the `convert.py` program.
+
+Usage:
+
+ $ python convert.py -h
Added: incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/googlenet/README.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/googlenet/README.md.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/googlenet/README.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/googlenet/README.md.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,84 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+-->
+---
+name: GoogleNet on ImageNet
+SINGA version: 1.0.1
+SINGA commit: 8c990f7da2de220e8a012c6a8ecc897dc7532744
+parameter_url: https://s3-ap-southeast-1.amazonaws.com/dlfile/bvlc_googlenet.tar.gz
+parameter_sha1: 0a88e8948b1abca3badfd8d090d6be03f8d7655d
+license: unrestricted https://github.com/BVLC/caffe/tree/master/models/bvlc_googlenet
+---
+
+# Image Classification using GoogleNet
+
+
+In this example, we convert a GoogleNet model trained with Caffe to SINGA for image classification.
+
+## Instructions
+
+* Download the parameter checkpoint file into this folder
+
+ $ wget https://s3-ap-southeast-1.amazonaws.com/dlfile/bvlc_googlenet.tar.gz
+ $ tar xvf bvlc_googlenet.tar.gz
+
+* Run the program
+
+ # use cpu
+ $ python serve.py -C &
+ # use gpu
+ $ python serve.py &
+
+* Submit images for classification
+
+ $ curl -i -F image=@image1.jpg http://localhost:9999/api
+ $ curl -i -F image=@image2.jpg http://localhost:9999/api
+ $ curl -i -F image=@image3.jpg http://localhost:9999/api
+
+image1.jpg, image2.jpg and image3.jpg should be downloaded before executing the above commands.
+
+## Details
+
+We first extract the parameter values from [Caffe's checkpoint file](http://dl.caffe.berkeleyvision.org/bvlc_googlenet.caffemodel) into a pickle version.
+After downloading the checkpoint file into the `caffe_root/python` folder, run the following script:
+
+    # to be executed within the caffe_root/python folder
+    import caffe
+    import numpy as np
+    try:
+        import cPickle as pickle  # Python 2
+    except ImportError:
+        import pickle  # Python 3
+
+    model_def = '../models/bvlc_googlenet/deploy.prototxt'
+    weight = 'bvlc_googlenet.caffemodel'  # must be downloaded first
+    net = caffe.Net(model_def, weight, caffe.TEST)
+
+    # copy the weight and bias blobs of every layer into a flat dict
+    params = {}
+    for layer_name in net.params.keys():
+        weights = np.copy(net.params[layer_name][0].data)
+        bias = np.copy(net.params[layer_name][1].data)
+        params[layer_name + '_weight'] = weights
+        params[layer_name + '_bias'] = bias
+        print(layer_name, weights.shape, bias.shape)
+
+    with open('bvlc_googlenet.pickle', 'wb') as fd:
+        pickle.dump(params, fd)
+
+Then we construct the GoogleNet using SINGA's FeedForwardNet structure.
+Note that we added an EndPadding layer to resolve the discrepancy between
+the rounding strategies of the pooling layer in Caffe (ceil) and cuDNN (floor).
+Only the MaxPooling layers outside the inception blocks have this problem.
+Refer to [this post](http://joelouismarino.github.io/blog_posts/blog_googlenet_keras.html) for more details.
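As a sanity check, the generated pickle can be reloaded to list the stored parameter names and shapes; a minimal sketch (the file name comes from the script above, the helper itself is illustrative):

```python
import pickle

def inspect_params(path: str):
    """Load a converted parameter pickle and report each entry's name
    and shape (falling back to len() for plain sequences)."""
    with open(path, 'rb') as fd:
        params = pickle.load(fd)
    report = {}
    for name, value in params.items():
        report[name] = getattr(value, 'shape', None) or (len(value),)
    return report

# for name, shape in inspect_params('bvlc_googlenet.pickle').items():
#     print(name, shape)
```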
Added: incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/inception/README.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/inception/README.md.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/inception/README.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/inception/README.md.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,61 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+-->
+---
+name: Inception V4 on ImageNet
+SINGA version: 1.1.1
+SINGA commit:
+parameter_url: https://s3-ap-southeast-1.amazonaws.com/dlfile/inception_v4.tar.gz
+parameter_sha1: 5fdd6f5d8af8fd10e7321d9b38bb87ef14e80d56
+license: https://github.com/tensorflow/models/tree/master/slim
+---
+
+# Image Classification using Inception V4
+
+In this example, we convert an Inception V4 model trained with TensorFlow to SINGA for image classification.
+
+## Instructions
+
+* Download the parameter checkpoint file
+
+    $ wget https://s3-ap-southeast-1.amazonaws.com/dlfile/inception_v4.tar.gz
+ $ tar xvf inception_v4.tar.gz
+
+* Download the [synset_words.txt](https://github.com/BVLC/caffe/blob/master/data/ilsvrc12/get_ilsvrc_aux.sh) file.
+
+* Run the program
+
+ # use cpu
+ $ python serve.py -C &
+ # use gpu
+ $ python serve.py &
+
+* Submit images for classification
+
+ $ curl -i -F image=@image1.jpg http://localhost:9999/api
+ $ curl -i -F image=@image2.jpg http://localhost:9999/api
+ $ curl -i -F image=@image3.jpg http://localhost:9999/api
+
+image1.jpg, image2.jpg and image3.jpg should be downloaded before executing the above commands.
+
+## Details
+
+We first extract the parameter values from [TensorFlow's checkpoint file](http://download.tensorflow.org/models/inception_v4_2016_09_09.tar.gz) into a pickle version.
+After downloading and decompressing the checkpoint file, run the following script:
+
+ $ python convert.py --file_name=inception_v4.ckpt
Added: incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/resnet/README.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/resnet/README.md.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/resnet/README.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/resnet/README.md.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,72 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+-->
+---
+name: Resnets on ImageNet
+SINGA version: 1.1
+SINGA commit: 45ec92d8ffc1fa1385a9307fdf07e21da939ee2f
+parameter_url: https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/resnet-18.tar.gz
+license: Apache V2, https://github.com/facebook/fb.resnet.torch/blob/master/LICENSE
+---
+
+# Image Classification using Residual Networks
+
+
+In this example, we convert Residual Networks trained on [Torch](https://github.com/facebook/fb.resnet.torch) to SINGA for image classification.
+
+## Instructions
+
+* Download one parameter checkpoint file (see below) and the synset word file of ImageNet into this folder, e.g.,
+
+ $ wget https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/resnet-18.tar.gz
+ $ wget https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/synset_words.txt
+ $ tar xvf resnet-18.tar.gz
+
+* Usage
+
+ $ python serve.py -h
+
+* Example
+
+ # use cpu
+ $ python serve.py --use_cpu --parameter_file resnet-18.pickle --model resnet --depth 18 &
+ # use gpu
+ $ python serve.py --parameter_file resnet-18.pickle --model resnet --depth 18 &
+
+ The parameter files for the following model and depth configuration pairs are provided:
+ * resnet (original resnet), [18](https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/resnet-18.tar.gz)|[34](https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/resnet-34.tar.gz)|[101](https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/resnet-101.tar.gz)|[152](https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/resnet-152.tar.gz)
+ * addbn (resnet with a batch normalization layer after the addition), [50](https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/resnet-50.tar.gz)
+ * wrn (wide resnet), [50](https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/wrn-50-2.tar.gz)
+  * preact (resnet with pre-activation), [200](https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/resnet-200.tar.gz)
+
+* Submit images for classification
+
+ $ curl -i -F image=@image1.jpg http://localhost:9999/api
+ $ curl -i -F image=@image2.jpg http://localhost:9999/api
+ $ curl -i -F image=@image3.jpg http://localhost:9999/api
+
+image1.jpg, image2.jpg and image3.jpg should be downloaded before executing the above commands.
+
+## Details
+
+The parameter files were extracted from the original [Torch files](https://github.com/facebook/fb.resnet.torch/tree/master/pretrained) via
+the `convert.py` program.
+
+Usage:
+
+ $ python convert.py -h
Added: incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/vgg/README.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/vgg/README.md.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/vgg/README.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/imagenet/vgg/README.md.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,69 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+-->
+---
+name: VGG models on ImageNet
+SINGA version: 1.1.1
+SINGA commit:
+license: https://github.com/pytorch/vision/blob/master/torchvision/models/vgg.py
+---
+
+# Image Classification using VGG
+
+
+In this example, we convert a VGG model from [PyTorch](https://github.com/pytorch/vision/blob/master/torchvision/models/vgg.py)
+to SINGA for image classification.
+
+## Instructions
+
+* Download one parameter checkpoint file (see below) and the synset word file of ImageNet into this folder, e.g.,
+
+ $ wget https://s3-ap-southeast-1.amazonaws.com/dlfile/vgg/vgg11.tar.gz
+ $ wget https://s3-ap-southeast-1.amazonaws.com/dlfile/resnet/synset_words.txt
+ $ tar xvf vgg11.tar.gz
+
+* Usage
+
+ $ python serve.py -h
+
+* Example
+
+ # use cpu
+ $ python serve.py --use_cpu --parameter_file vgg11.pickle --depth 11 &
+ # use gpu
+ $ python serve.py --parameter_file vgg11.pickle --depth 11 &
+
+ The parameter files for the following model and depth configuration pairs are provided:
+ * Without batch-normalization, [11](https://s3-ap-southeast-1.amazonaws.com/dlfile/vgg/vgg11.tar.gz), [13](https://s3-ap-southeast-1.amazonaws.com/dlfile/vgg/vgg13.tar.gz), [16](https://s3-ap-southeast-1.amazonaws.com/dlfile/vgg/vgg16.tar.gz), [19](https://s3-ap-southeast-1.amazonaws.com/dlfile/vgg/vgg19.tar.gz)
+ * With batch-normalization, [11](https://s3-ap-southeast-1.amazonaws.com/dlfile/vgg/vgg11_bn.tar.gz), [13](https://s3-ap-southeast-1.amazonaws.com/dlfile/vgg/vgg13_bn.tar.gz), [16](https://s3-ap-southeast-1.amazonaws.com/dlfile/vgg/vgg16_bn.tar.gz), [19](https://s3-ap-southeast-1.amazonaws.com/dlfile/vgg/vgg19_bn.tar.gz)
+
+* Submit images for classification
+
+ $ curl -i -F image=@image1.jpg http://localhost:9999/api
+ $ curl -i -F image=@image2.jpg http://localhost:9999/api
+ $ curl -i -F image=@image3.jpg http://localhost:9999/api
+
+image1.jpg, image2.jpg and image3.jpg should be downloaded before executing the above commands.
+
+## Details
+
+The parameter files were converted from PyTorch via the `convert.py` program.
+
+Usage:
+
+ $ python convert.py -h
Added: incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/index.rst.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/index.rst.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/index.rst.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/index.rst.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,33 @@
+..
+.. Licensed to the Apache Software Foundation (ASF) under one
+.. or more contributor license agreements. See the NOTICE file
+.. distributed with this work for additional information
+.. regarding copyright ownership. The ASF licenses this file
+.. to you under the Apache License, Version 2.0 (the
+.. "License"); you may not use this file except in compliance
+.. with the License. You may obtain a copy of the License at
+..
+.. http://www.apache.org/licenses/LICENSE-2.0
+..
+.. Unless required by applicable law or agreed to in writing, software
+.. distributed under the License is distributed on an "AS IS" BASIS,
+.. WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+.. See the License for the specific language governing permissions and
+.. limitations under the License.
+..
+
+Model Zoo
+=========
+
+.. toctree::
+
+ cifar10/README
+ char-rnn/README
+ mnist/README
+ imagenet/alexnet/README
+ imagenet/densenet/README
+ imagenet/googlenet/README
+ imagenet/inception/README
+ imagenet/resnet/README
+ imagenet/vgg/README
+
Added: incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/mnist/README.md.txt
URL: http://svn.apache.org/viewvc/incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/mnist/README.md.txt?rev=1858059&view=auto
==============================================================================
--- incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/mnist/README.md.txt (added)
+++ incubator/singa/site/trunk/en/_sources/docs/model_zoo/examples/mnist/README.md.txt Wed Apr 24 14:57:35 2019
@@ -0,0 +1,36 @@
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements. See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership. The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License. You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied. See the License for the
+ specific language governing permissions and limitations
+ under the License.
+-->
+# Train an RBM model on the MNIST dataset
+
+This example trains an RBM model on the
+MNIST dataset. The RBM model and its hyper-parameters follow
+[Hinton's paper](http://www.cs.toronto.edu/~hinton/science.pdf).
+
+## Running instructions
+
+1. Download the pre-processed [MNIST dataset](https://github.com/mnielsen/neural-networks-and-deep-learning/raw/master/data/mnist.pkl.gz)
+
+2. Start the training
+
+ python train.py mnist.pkl.gz
+
+By default the training code runs on the CPU. To run it on a GPU card, start
+the program with an additional argument:
+
+ python train.py mnist.pkl.gz --use_gpu