Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2018/11/09 00:01:29 UTC

[GitHub] nswamy closed pull request #13093: Improve cpp-package example build files.
URL: https://github.com/apache/incubator-mxnet/pull/13093

This is a PR merged from a forked repository.
As GitHub hides the original diff on merge, it is displayed below for
the sake of provenance:

diff --git a/cpp-package/example/Makefile b/cpp-package/example/Makefile
index eb0676cedf5..6b64469b5d2 100644
--- a/cpp-package/example/Makefile
+++ b/cpp-package/example/Makefile
@@ -16,6 +16,7 @@
 # under the License.
 
 prebuild :
+	@mkdir -p build
 	$(shell ./get_data.sh)
 	$(shell cp -r ../../lib ./)
 CPPEX_SRC = $(wildcard *.cpp)
@@ -38,8 +39,9 @@ debug: CPPEX_CFLAGS += -DDEBUG -g
 debug: prebuild all
 
 
+
 $(CPPEX_EXE):% : %.cpp
-	$(CXX) -std=c++0x $(CFLAGS)  $(CPPEX_CFLAGS) -o $@ $(filter %.cpp %.a, $^) $(CPPEX_EXTRA_LDFLAGS)
+	$(CXX) -std=c++0x $(CFLAGS)  $(CPPEX_CFLAGS) -o build/$@ $(filter %.cpp %.a, $^) $(CPPEX_EXTRA_LDFLAGS)
 
 clean:
-	rm -f $(CPPEX_EXE)
+	@rm -rf build
diff --git a/cpp-package/example/README.md b/cpp-package/example/README.md
index 5d2f3b01f8f..64f604469a7 100644
--- a/cpp-package/example/README.md
+++ b/cpp-package/example/README.md
@@ -27,7 +27,7 @@ This directory contains following examples. In order to run the examples, ensure
 The example implements the C++ version of AlexNet. The network trains on MNIST data. The number of epochs can be specified as a command line argument. For example, to train with 10 epochs use the following:
 
 	```
-	./alexnet 10
+	build/alexnet 10
 	```
 
 ### [googlenet.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/googlenet.cpp>)
@@ -35,7 +35,7 @@ The example implements the C++ version of AlexNet. The networks trains on MNIST
 The code implements a GoogLeNet/Inception network using the C++ API. The example uses MNIST data to train the network. By default, the example trains the model for 100 epochs. The number of epochs can also be specified in the command line. For example, to train the model for 10 epochs use the following:
 
 ```
-./googlenet 10
+build/googlenet 10
 ```
 
 ### [mlp.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/mlp.cpp>)
@@ -44,7 +44,7 @@ The code implements a multilayer perceptron from scratch. The example creates it
 To run the example use the following command:
 
 ```
-./mlp
+build/mlp
 ```
 
 ### [mlp_cpu.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/mlp_cpu.cpp>)
@@ -53,7 +53,7 @@ The code implements a multilayer perceptron to train the MNIST data. The code de
 To run the example use the following command:
 
 ```
-./mlp_cpu
+build/mlp_cpu
 ```
 
 ### [mlp_gpu.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/mlp_gpu.cpp>)
@@ -61,7 +61,7 @@ To run the example use the following command:
 The code implements a multilayer perceptron to train the MNIST data. The code demonstrates the use of the "SimpleBind"  C++ API and MNISTIter. The example is designed to work on GPU. The example does not require command line arguments. To run the example execute following command:
 
 ```
-./mlp_gpu
+build/mlp_gpu
 ```
 
 ### [mlp_csv.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/mlp_csv.cpp>)
@@ -69,7 +69,7 @@ The code implements a multilayer perceptron to train the MNIST data. The code de
 The code implements a multilayer perceptron to train the MNIST data. The code demonstrates the use of the "SimpleBind"  C++ API and CSVIter. The CSVIter can iterate data that is in CSV format. The example can be run on CPU or GPU. The example usage is as follows:
 
 ```
-mlp_csv --train mnist_training_set.csv --test mnist_test_set.csv --epochs 10 --batch_size 100 --hidden_units "128,64,64 [--gpu]"
+build/mlp_csv --train mnist_training_set.csv --test mnist_test_set.csv --epochs 10 --batch_size 100 --hidden_units "128,64,64 [--gpu]"
 ```
 
 ### [resnet.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/resnet.cpp>)
@@ -77,7 +77,7 @@ mlp_csv --train mnist_training_set.csv --test mnist_test_set.csv --epochs 10 --b
 The code implements a resnet model using the C++ API. The model is used to train MNIST data. The number of epochs for training the model can be specified on the command line. By default, model is trained for 100 epochs. For example, to train with 10 epochs use the following command:
 
 ```
-./resnet 10
+build/resnet 10
 ```
 
 ### [lenet.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/lenet.cpp>)
@@ -85,14 +85,14 @@ The code implements a resnet model using the C++ API. The model is used to train
 The code implements a lenet model using the C++ API. It uses MNIST training data in CSV format to train the network. The example does not use the built-in CSVIter to read the data from the CSV file. The number of epochs can be specified on the command line. By default, the model is trained for 100,000 epochs. For example, to train with 10 epochs use the following command:
 
 ```
-./lenet 10
+build/lenet 10
 ```
 ### [lenet\_with\_mxdataiter.cpp](<https://github.com/apache/incubator-mxnet/blob/master/cpp-package/example/mlp_cpu.cpp>)
 
 The code implements a lenet model using the C++ API. It uses MNIST training data to train the network. The example uses the built-in MNISTIter to read the data. The number of epochs can be specified on the command line. By default, the model is trained for 100 epochs. For example, to train with 10 epochs use the following command:
 
 ```
-./lenet\_with\_mxdataiter 10
+build/lenet_with_mxdataiter 10
 ```
 
 In addition, there is `run_lenet_with_mxdataiter.sh` that downloads the mnist data and runs the `lenet_with_mxdataiter` example.
@@ -102,5 +102,5 @@ In addition, there is `run_lenet_with_mxdataiter.sh` that downloads the mnist da
 The code implements an Inception network using the C++ API with batch normalization. The example uses MNIST data to train the network. The model trains for 100 epochs. The example can be run by executing the following command:
 
 ```
-./inception_bn
+build/inception_bn
 ```
diff --git a/cpp-package/example/example.mk b/cpp-package/example/example.mk
index 4914b31ba84..cb8fec1da06 100644
--- a/cpp-package/example/example.mk
+++ b/cpp-package/example/example.mk
@@ -18,7 +18,7 @@
 CPPEX_SRC = $(wildcard cpp-package/example/*.cpp)
 CPPEX_EXE = $(patsubst cpp-package/example/%.cpp, build/cpp-package/example/%, $(CPPEX_SRC))
 
-CPPEX_CFLAGS += -Icpp-package/include -Ibuild/cpp-package/include
+CPPEX_CFLAGS += -Icpp-package/include
 CPPEX_EXTRA_LDFLAGS := -L$(ROOTDIR)/lib -lmxnet
 
 EXTRA_PACKAGES += cpp-package-example-all
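The Makefile changes above hinge on the `prebuild` target creating a `build/` directory before compilation, and on `clean` removing that directory wholesale. A minimal sketch of why this is safe (the target names and paths are taken from the diff; the commands below only demonstrate the `mkdir -p` / `rm -rf` semantics, not an actual MXNet build):

```shell
# mkdir -p is idempotent: it creates the directory if missing and
# exits 0 without error if it already exists, so the prebuild
# target can run on every invocation of make.
mkdir -p build
mkdir -p build          # re-running is a no-op, not an error
test -d build && echo "build dir ready"

# The new clean target removes the whole directory, taking all
# compiled example binaries (build/alexnet, build/mlp, ...) with it.
rm -rf build
test ! -d build && echo "clean removed build dir"
```

Collecting binaries under `build/` means `clean` no longer has to enumerate every executable name, which is why the diff replaces `rm -f $(CPPEX_EXE)` with a single `rm -rf build`.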
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services