Posted to commits@kudu.apache.org by gr...@apache.org on 2019/01/10 23:05:22 UTC
[kudu] 06/09: [examples] fix name of the class for spark-submit
This is an automated email from the ASF dual-hosted git repository.
granthenke pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/kudu.git
commit 0fcb8890c64e948099a2bd156452ecfe15376b5f
Author: Alexey Serbin <al...@apache.org>
AuthorDate: Wed Jan 9 17:01:01 2019 -0800
[examples] fix name of the class for spark-submit
I also updated the description of master RPC endpoints and
removed the redundant default port number for masters' RPC
endpoints in the example.
Change-Id: Ic36a6fb62bd55667ea9f11c0124b5388f0e5bc7b
Reviewed-on: http://gerrit.cloudera.org:8080/12206
Tested-by: Kudu Jenkins
Reviewed-by: Grant Henke <gr...@apache.org>
Reviewed-by: Hao Hao <ha...@cloudera.com>
---
examples/scala/spark-example/README.adoc | 18 +++++++++++-------
1 file changed, 11 insertions(+), 7 deletions(-)
diff --git a/examples/scala/spark-example/README.adoc b/examples/scala/spark-example/README.adoc
index 8969325..f5c4e36 100644
--- a/examples/scala/spark-example/README.adoc
+++ b/examples/scala/spark-example/README.adoc
@@ -41,21 +41,25 @@ $ mvn package
To configure the kudu-spark example, there are two Java system properties
available:
-- kuduMasters: A comma-separated list of Kudu master addresses.
+- kuduMasters: A comma-separated list of Kudu master RPC endpoints, where
+ each endpoint is in form '<HostName|IPAddress>[:PortNumber]' (the port number
+ by default is 7051 if not specified).
Default: 'localhost:7051'.
- tableName: The name of the table to use for the example program. This
table should not exist in Kudu. Default: 'spark_test'.
The application can be run using `spark-submit`. For example, to run the
-example against a Spark cluster running on YARN, use a command like the
-following:
+example against a Spark cluster running on YARN with Kudu masters at nodes
+master1, master2, master3, use a command like the following:
[source.bash]
----
-$ spark-submit --class org.apache.kudu.examples.SparkExample --master yarn \
---driver-java-options \
-'-DkuduMasters=master1:7051,master2:7051,master3:7051 -DtableName=test_table' \
-target/kudu-spark-example-1.0-SNAPSHOT.jar
+$ spark-submit \
+ --class org.apache.kudu.spark.examples.SparkExample \
+ --master yarn \
+ --driver-java-options \
+ '-DkuduMasters=master1,master2,master3 -DtableName=test_table' \
+ target/kudu-spark-example-1.0-SNAPSHOT.jar
----
You will need the Kudu cluster to be up and running and Spark correctly
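The patch above documents two Java system properties (`kuduMasters`, `tableName`) and notes that a master endpoint's port defaults to 7051 when omitted. As a hedged illustration of how such `-D` flags are typically consumed, here is a minimal sketch. It is not the actual `SparkExample` source; the class and helper names are hypothetical, and the defaults are taken from the README text in the diff.

```java
// Hypothetical sketch, not the real org.apache.kudu.spark.examples.SparkExample:
// shows how -DkuduMasters / -DtableName would commonly be read, applying the
// defaults the README describes.
public class KuduSparkConfig {
    // Defaults quoted from the README: 'localhost:7051' and 'spark_test'.
    static final String DEFAULT_MASTERS = "localhost:7051";
    static final String DEFAULT_TABLE = "spark_test";

    // System.getProperty picks up values passed via --driver-java-options '-Dkey=value'.
    static String kuduMasters() {
        return System.getProperty("kuduMasters", DEFAULT_MASTERS);
    }

    static String tableName() {
        return System.getProperty("tableName", DEFAULT_TABLE);
    }

    // Per the README, an endpoint is '<HostName|IPAddress>[:PortNumber]' with
    // port 7051 assumed when absent. (This sketch does not handle bare IPv6
    // literals, which contain colons.)
    static String withDefaultPort(String endpoint) {
        return endpoint.contains(":") ? endpoint : endpoint + ":7051";
    }
}
```

With the `spark-submit` invocation from the patch (`-DkuduMasters=master1,master2,master3`), `kuduMasters()` would return `master1,master2,master3`, and each comma-separated entry would normalize to `masterN:7051` under this sketch's defaulting rule.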