Posted to commits@mxnet.apache.org by ns...@apache.org on 2018/11/09 00:01:50 UTC

[incubator-mxnet] branch master updated: Improve cpp-package example project build files. (#13093)

This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
     new 7f1d53e  Improve cpp-package example project build files. (#13093)
7f1d53e is described below

commit 7f1d53e6d023a65b2fe83364417a99012de83ea4
Author: Frank Liu <fr...@gmail.com>
AuthorDate: Thu Nov 8 16:01:27 2018 -0800

    Improve cpp-package example project build files. (#13093)
    
    1. Change output to the build folder.
    2. Remove files that were not deleted by make clean.
---
 cpp-package/example/Makefile   |  6 ++++--
 cpp-package/example/README.md  | 20 ++++++++++----------
 cpp-package/example/example.mk |  2 +-
 3 files changed, 15 insertions(+), 13 deletions(-)

diff --git a/cpp-package/example/Makefile b/cpp-package/example/Makefile
index eb0676c..6b64469 100644
--- a/cpp-package/example/Makefile
+++ b/cpp-package/example/Makefile
@@ -16,6 +16,7 @@
 # under the License.
 
 prebuild :
+	@mkdir -p build
 	$(shell ./get_data.sh)
 	$(shell cp -r ../../lib ./)
 CPPEX_SRC = $(wildcard *.cpp)
@@ -38,8 +39,9 @@ debug: CPPEX_CFLAGS += -DDEBUG -g
 debug: prebuild all
 
 
+
 $(CPPEX_EXE):% : %.cpp
-	$(CXX) -std=c++0x $(CFLAGS)  $(CPPEX_CFLAGS) -o $@ $(filter %.cpp %.a, $^) $(CPPEX_EXTRA_LDFLAGS)
+	$(CXX) -std=c++0x $(CFLAGS)  $(CPPEX_CFLAGS) -o build/$@ $(filter %.cpp %.a, $^) $(CPPEX_EXTRA_LDFLAGS)
 
 clean:
-	rm -f $(CPPEX_EXE)
+	@rm -rf build
diff --git a/cpp-package/example/README.md b/cpp-package/example/README.md
index 5d2f3b0..64f6044 100644
--- a/cpp-package/example/README.md
+++ b/cpp-package/example/README.md
@@ -27,7 +27,7 @@ This directory contains following examples. In order to run the examples, ensure
 The example implements the C++ version of AlexNet. The networks trains on MNIST data. The number of epochs can be specified as a command line argument. For example to train with 10 epochs use the following:
 
 	```
-	./alexnet 10
+	build/alexnet 10
 	```
 
 ### [googlenet.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/googlenet.cpp>)
@@ -35,7 +35,7 @@ The example implements the C++ version of AlexNet. The networks trains on MNIST
 The code implements a GoogLeNet/Inception network using the C++ API. The example uses MNIST data to train the network. By default, the example trains the model for 100 epochs. The number of epochs can also be specified in the command line. For example, to train the model for 10 epochs use the following:
 
 ```
-./googlenet 10
+build/googlenet 10
 ```
 
 ### [mlp.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/mlp.cpp>)
@@ -44,7 +44,7 @@ The code implements a multilayer perceptron from scratch. The example creates it
 To run the example use the following command:
 
 ```
-./mlp
+build/mlp
 ```
 
 ### [mlp_cpu.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/mlp_cpu.cpp>)
@@ -53,7 +53,7 @@ The code implements a multilayer perceptron to train the MNIST data. The code de
 To run the example use the following command:
 
 ```
-./mlp_cpu
+build/mlp_cpu
 ```
 
 ### [mlp_gpu.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/mlp_gpu.cpp>)
@@ -61,7 +61,7 @@ To run the example use the following command:
 The code implements a multilayer perceptron to train the MNIST data. The code demonstrates the use of the "SimpleBind"  C++ API and MNISTIter. The example is designed to work on GPU. The example does not require command line arguments. To run the example execute following command:
 
 ```
-./mlp_gpu
+build/mlp_gpu
 ```
 
 ### [mlp_csv.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/mlp_csv.cpp>)
@@ -69,7 +69,7 @@ The code implements a multilayer perceptron to train the MNIST data. The code de
 The code implements a multilayer perceptron to train the MNIST data. The code demonstrates the use of the "SimpleBind"  C++ API and CSVIter. The CSVIter can iterate data that is in CSV format. The example can be run on CPU or GPU. The example usage is as follows:
 
 ```
-mlp_csv --train mnist_training_set.csv --test mnist_test_set.csv --epochs 10 --batch_size 100 --hidden_units "128,64,64 [--gpu]"
+build/mlp_csv --train mnist_training_set.csv --test mnist_test_set.csv --epochs 10 --batch_size 100 --hidden_units "128,64,64 [--gpu]"
 ```
 
 ### [resnet.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/resnet.cpp>)
@@ -77,7 +77,7 @@ mlp_csv --train mnist_training_set.csv --test mnist_test_set.csv --epochs 10 --b
 The code implements a resnet model using the C++ API. The model is used to train MNIST data. The number of epochs for training the model can be specified on the command line. By default, model is trained for 100 epochs. For example, to train with 10 epochs use the following command:
 
 ```
-./resnet 10
+build/resnet 10
 ```
 
 ### [lenet.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/lenet.cpp>)
@@ -85,14 +85,14 @@ The code implements a resnet model using the C++ API. The model is used to train
 The code implements a lenet model using the C++ API. It uses MNIST training data in CSV format to train the network. The example does not use built-in CSVIter to read the data from CSV file. The number of epochs can be specified on the command line. By default, the mode is trained for 100,000 epochs. For example, to train with 10 epochs use the following command:
 
 ```
-./lenet 10
+build/lenet 10
 ```
 ### [lenet\_with\_mxdataiter.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/mlp_cpu.cpp>)
 
 The code implements a lenet model using the C++ API. It uses MNIST training data to train the network. The example uses built-in MNISTIter to read the data. The number of epochs can be specified on the command line. By default, the mode is trained for 100 epochs. For example, to train with 10 epochs use the following command:
 
 ```
-./lenet\_with\_mxdataiter 10
+build/lenet_with_mxdataiter 10
 ```
 
 In addition, there is `run_lenet_with_mxdataiter.sh` that downloads the mnist data and run `lenet_with_mxdataiter` example.
@@ -102,5 +102,5 @@ In addition, there is `run_lenet_with_mxdataiter.sh` that downloads the mnist da
 The code implements an Inception network using the C++ API with batch normalization. The example uses MNIST data to train the network. The model trains for 100 epochs. The example can be run by executing the following command:
 
 ```
-./inception_bn
+build/inception_bn
 ```
diff --git a/cpp-package/example/example.mk b/cpp-package/example/example.mk
index 4914b31..cb8fec1 100644
--- a/cpp-package/example/example.mk
+++ b/cpp-package/example/example.mk
@@ -18,7 +18,7 @@
 CPPEX_SRC = $(wildcard cpp-package/example/*.cpp)
 CPPEX_EXE = $(patsubst cpp-package/example/%.cpp, build/cpp-package/example/%, $(CPPEX_SRC))
 
-CPPEX_CFLAGS += -Icpp-package/include -Ibuild/cpp-package/include
+CPPEX_CFLAGS += -Icpp-package/include
 CPPEX_EXTRA_LDFLAGS := -L$(ROOTDIR)/lib -lmxnet
 
 EXTRA_PACKAGES += cpp-package-example-all