Posted to user@spark.apache.org by Makoto Hashimoto <to...@gmail.com> on 2018/05/22 07:45:33 UTC

Encountering 'Could not find or load main class' error when submitting a Spark job on Kubernetes

Hi,

I am trying to run a Spark job on Kubernetes. Running the job locally
works fine, as follows:
$ ./bin/spark-submit --class org.apache.spark.examples.SparkPi \
    --master local[4] examples/jars/spark-examples_2.11-2.3.0.jar 100
......
2018-05-20 21:49:02 INFO  DAGScheduler:54 - Job 0 finished: reduce at
SparkPi.scala:38, took 2.459637 s
Pi is roughly 3.1418607141860715
2018-05-20 21:49:02 INFO  AbstractConnector:318 - Stopped
Spark@41bb8c78{HTTP/1.1,[http/1.1]}{localhost:4040}
2018-05-20 21:49:02 INFO  SparkUI:54 - Stopped Spark web UI at
http://localhost:4040
2018-05-20 21:49:02 INFO  MapOutputTrackerMasterEndpoint:54 -
MapOutputTrackerMasterEndpoint stopped!
2018-05-20 21:49:02 INFO  MemoryStore:54 - MemoryStore cleared
2018-05-20 21:49:02 INFO  BlockManager:54 - BlockManager stopped
2018-05-20 21:49:02 INFO  BlockManagerMaster:54 - BlockManagerMaster stopped
2018-05-20 21:49:02 INFO
OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 -
OutputCommitCoordinator stopped!
2018-05-20 21:49:02 INFO  SparkContext:54 - Successfully stopped SparkContext
2018-05-20 21:49:02 INFO  ShutdownHookManager:54 - Shutdown hook called
2018-05-20 21:49:02 INFO  ShutdownHookManager:54 - Deleting directory
/tmp/spark-ad68e56c-7991-4c6c-b3c5-99ab481a1449
2018-05-20 21:49:02 INFO  ShutdownHookManager:54 - Deleting directory
/tmp/spark-bbcce77f-70a4-4ec1-ad05-e8819fd3ba7a

When I submitted the Spark job on Kubernetes, it ended with an error.

$ bin/spark-submit --master k8s://https://192.168.99.100:8443 \
    --deploy-mode cluster --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=5 \
    --conf spark.kubernetes.container.image=tokoma1/spark:1.0 \
    --conf spark.kubernetes.driver.pod.name=spark-pi-driver \
    local:///usr/local/oss/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar \
    100
...
     Container name: spark-kubernetes-driver
     Container image: tokoma1/spark:1.0
     Container state: Terminated
     Exit code: 1
2018-05-20 21:59:02 INFO  Client:54 - Application spark-pi finished.
2018-05-20 21:59:02 INFO  ShutdownHookManager:54 - Shutdown hook called
2018-05-20 21:59:02 INFO  ShutdownHookManager:54 - Deleting directory
/tmp/spark-485f73a5-7416-4caa-acb2-49b0bde5eb80

I checked the status of the pod as follows:

$ kubectl get pods

NAME              READY     STATUS    RESTARTS   AGE
spark-pi-driver   0/1       Error     0          1m

This means it ended with an error.
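
For more detail than the STATUS column gives, kubectl describe also
reports the container state, exit code, and recent events; for example:

$ kubectl describe pod spark-pi-driver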

I checked the log:

$ kubectl -n=default logs -f spark-pi-driver
++ id -u
+ myuid=0
++ id -g
+ mygid=0
++ getent passwd 0
+ uidentry=root:x:0:0:root:/root:/bin/ash
+ '[' -z root:x:0:0:root:/root:/bin/ash ']'
+ SPARK_K8S_CMD=driver
+ '[' -z driver ']'
+ shift 1
+ SPARK_CLASSPATH=':/opt/spark/jars/*'
+ env
+ grep SPARK_JAVA_OPT_
+ sed 's/[^=]*=\(.*\)/\1/g'
+ readarray -t SPARK_JAVA_OPTS
+ '[' -n /usr/local/oss/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar:/usr/local/oss/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar
']'
+ SPARK_CLASSPATH=':/opt/spark/jars/*:/usr/local/oss/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar:/usr/local/oss/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar'
+ '[' -n '' ']'
+ case "$SPARK_K8S_CMD" in
+ CMD=(${JAVA_HOME}/bin/java "${SPARK_JAVA_OPTS[@]}" -cp
"$SPARK_CLASSPATH" -Xms$SPARK_DRIVER_MEMORY -Xmx$SPARK_DRIVER_MEMORY
-Dspark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS
$SPARK_DRIVER_CLASS $SPARK_DRIVER_ARGS)
+ exec /sbin/tini -s -- /usr/lib/jvm/java-1.8-openjdk/bin/java
-Dspark.driver.port=7078
-Dspark.master=k8s://https://192.168.99.100:8443
-Dspark.kubernetes.driver.pod.name=spark-pi-driver
-Dspark.driver.blockManager.port=7079
-Dspark.kubernetes.container.image=tokoma1/spark:1.0
-Dspark.jars=/usr/local/oss/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar,/usr/local/oss/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar
-Dspark.app.name=spark-pi
-Dspark.app.id=spark-9762ba052680404a9220f451d99ba818
-Dspark.submit.deployMode=cluster -Dspark.executor.instances=5
-Dspark.kubernetes.executor.podNamePrefix=spark-pi-01f873a813323a4a85eb7a2464949141
-Dspark.driver.host=spark-pi-01f873a813323a4a85eb7a2464949141-driver-svc.default.svc
-cp ':/opt/spark/jars/*:/usr/local/oss/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar:/usr/local/oss/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar'
-Xms1g -Xmx1g -Dspark.driver.bindAddress=172.17.0.4
org.apache.spark.examples.SparkPi 100
Error: Could not find or load main class org.apache.spark.examples.SparkPi

Has anybody encountered the same error and found a resolution?

Thanks,

Re: Encountering 'Could not find or load main class' error when submitting a Spark job on Kubernetes

Posted by Marcelo Vanzin <va...@cloudera.com>.
On Tue, May 22, 2018 at 12:45 AM, Makoto Hashimoto
<to...@gmail.com> wrote:
> local:///usr/local/oss/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar

Is that the path of the jar inside your Docker image? The default
image puts it under /opt/spark, IIRC.
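
If the image was built from the Dockerfile shipped with the Spark 2.3.0
distribution, the examples jar should live under /opt/spark/examples/jars
inside the container. One way to check (a sketch, assuming Docker is
available on the submitting machine; --entrypoint bypasses the image's
normal entrypoint script):

$ docker run --rm --entrypoint ls tokoma1/spark:1.0 /opt/spark/examples/jars

If the jar is there, resubmitting with the in-image path should let the
driver find the class (same command as before, only the local:// path
changed):

$ bin/spark-submit --master k8s://https://192.168.99.100:8443 \
    --deploy-mode cluster --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=5 \
    --conf spark.kubernetes.container.image=tokoma1/spark:1.0 \
    --conf spark.kubernetes.driver.pod.name=spark-pi-driver \
    local:///opt/spark/examples/jars/spark-examples_2.11-2.3.0.jar 100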

-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org