Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2019/02/15 00:56:00 UTC

[jira] [Resolved] (SPARK-24599) SPARK_MOUNTED_CLASSPATH contains incorrect semicolon on Windows

     [ https://issues.apache.org/jira/browse/SPARK-24599?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin resolved SPARK-24599.
------------------------------------
    Resolution: Won't Fix

SPARK_MOUNTED_CLASSPATH doesn't exist anymore, so I'm assuming that the current way of starting the driver / executor avoids this issue.

> SPARK_MOUNTED_CLASSPATH contains incorrect semicolon on Windows
> ---------------------------------------------------------------
>
>                 Key: SPARK-24599
>                 URL: https://issues.apache.org/jira/browse/SPARK-24599
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes, Windows
>    Affects Versions: 2.3.0, 2.3.1
>            Reporter: Tobias Munk
>            Priority: Major
>
> When running spark-submit in cluster mode on Kubernetes from a Windows machine, the environment variable {{SPARK_MOUNTED_CLASSPATH}} incorrectly contains a semicolon:
> {code:bash}
> $ echo $SPARK_MOUNTED_CLASSPATH
> /opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar;/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar
> {code}
> When running spark-submit, the driver aborts:
> {code:bash}
> ./bin/spark-submit.cmd --master k8s://https://localhost:6445 --deploy-mode cluster --name spark-pi --class org.apache.spark.examples.SparkPi --conf spark.executor.instances=1 --conf spark.kubernetes.container.image=spark:k8s-spark1 local:///opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar{code}
> {code:bash}
> kubectl logs spark-pi-b12d0501f2fc309d89e8634937b7f52c-driver
> ++ id -u
> + myuid=0
> ++ id -g
> + mygid=0
> ++ getent passwd 0
> + uidentry=root:x:0:0:root:/root:/bin/ash
> + '[' -z root:x:0:0:root:/root:/bin/ash ']'
> + SPARK_K8S_CMD=driver
> + '[' -z driver ']'
> + shift 1
> + SPARK_CLASSPATH=':/opt/spark/jars/*'
> + env
> + grep SPARK_JAVA_OPT_
> + sed 's/[^=]*=\(.*\)/\1/g'
> + sort -t_ -k4 -n
> + readarray -t SPARK_JAVA_OPTS
> + '[' -n '/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar;/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar' ']'
> + SPARK_CLASSPATH=':/opt/spark/jars/*:/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar;/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar'
> + '[' -n '' ']'
> + case "$SPARK_K8S_CMD" in
> + CMD=(${JAVA_HOME}/bin/java "${SPARK_JAVA_OPTS[@]}" -cp "$SPARK_CLASSPATH" -Xms$SPARK_DRIVER_MEMORY -Xmx$SPARK_DRIVER_MEMORY -Dspark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS $SPARK_DRIVER_CLASS $SPARK_DRIVER_ARGS)
> + exec /sbin/tini -s -- /usr/lib/jvm/java-1.8-openjdk/bin/java -Dspark.kubernetes.driver.pod.name=spark-pi-b12d0501f2fc309d89e8634937b7f52c-driver -Dspark.jars=/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar,/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar -Dspark.app.name=spark-pi -Dspark.submit.deployMode=cluster -Dspark.driver.blockManager.port=7079 -Dspark.kubernetes.executor.podNamePrefix=spark-pi-b12d0501f2fc309d89e8634937b7f52c -Dspark.executor.instances=1 -Dspark.app.id=spark-65f2c8cc3ccf462694a67c18e947158c -Dspark.driver.port=7078 -Dspark.master=k8s://https://localhost:6445 -Dspark.kubernetes.container.image=spark:k8s-spark1 -Dspark.driver.host=spark-pi-b12d0501f2fc309d89e8634937b7f52c-driver-svc.default.svc -cp ':/opt/spark/jars/*:/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar;/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar' -Xms1g -Xmx1g -Dspark.driver.bindAddress=10.1.0.150 org.apache.spark.examples.SparkPi
> Error: Could not find or load main class org.apache.spark.examples.SparkPi{code}
> (Note the semicolon in the last part of the SPARK_CLASSPATH=... line.)
> You can work around this by overriding {{SPARK_MOUNTED_CLASSPATH}} in {{$SPARK_HOME/kubernetes/dockerfiles/spark/entrypoint.sh}}, removing the semicolon-separated part, and then rebuilding the Docker image with {{$SPARK_HOME/bin/docker-image-tool.sh}}. After that, spark-submit succeeds.
> See also this Stack Overflow question: https://stackoverflow.com/questions/49728170/spark-submit-from-windows-vs-linux
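The workaround above can be sketched as a normalization guard. This is an illustrative snippet, not the actual Spark entrypoint.sh or the project's eventual fix: it shows how a classpath that was joined with the Windows separator (`;`) by the submission client can be converted to the POSIX form (`:`) that the JVM inside the Linux container expects.

```shell
#!/bin/sh
# Illustrative only: the value below mirrors the semicolon-separated
# classpath from the bug report; the guard is not Spark's real fix.
SPARK_MOUNTED_CLASSPATH='/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar;/opt/spark/examples/jars/spark-examples_2.11-2.3.1.jar'

# Replace every Windows-style ';' separator with the POSIX ':' so the
# JVM does not treat the whole string as one nonexistent jar path.
SPARK_MOUNTED_CLASSPATH=$(printf '%s' "$SPARK_MOUNTED_CLASSPATH" | tr ';' ':')

echo "$SPARK_MOUNTED_CLASSPATH"
```

With the separator normalized, the `-cp` argument built from this variable contains two valid entries instead of a single entry with an embedded semicolon, which is why the driver could not load `org.apache.spark.examples.SparkPi`.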



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
