Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2018/01/04 18:13:00 UTC
[jira] [Created] (SPARK-22962) Kubernetes app fails if local files are used
Marcelo Vanzin created SPARK-22962:
--------------------------------------
Summary: Kubernetes app fails if local files are used
Key: SPARK-22962
URL: https://issues.apache.org/jira/browse/SPARK-22962
Project: Spark
Issue Type: Improvement
Components: Kubernetes
Affects Versions: 2.3.0
Reporter: Marcelo Vanzin
If you try to start a Spark app on Kubernetes using a local file as the app resource, it fails. For example:
{code}
./bin/spark-submit [[bunch of arguments]] /path/to/local/file.jar
{code}
{noformat}
+ /sbin/tini -s -- /bin/sh -c 'SPARK_CLASSPATH="${SPARK_HOME}/jars/*" && env | grep SPARK_JAVA_OPT_ | sed '\''s/[^=]*=\(.*\)/\1/g'\'' > /tmp/java_opts.txt && readarray -t SPARK_DRIVER_JAVA_OPTS < /tmp/java_opts.txt && if ! [ -z ${SPARK_MOUNTED_CLASSPATH+x} ]; then SPARK_CLASSPATH="$SPARK_MOUNTED_CLASSPATH:$SPARK_CLASSPATH"; fi && if ! [ -z ${SPARK_SUBMIT_EXTRA_CLASSPATH+x} ]; then SPARK_CLASSPATH="$SPARK_SUBMIT_EXTRA_CLASSPATH:$SPARK_CLASSPATH"; fi && if ! [ -z ${SPARK_MOUNTED_FILES_DIR+x} ]; then cp -R "$SPARK_MOUNTED_FILES_DIR/." .; fi && ${JAVA_HOME}/bin/java "${SPARK_DRIVER_JAVA_OPTS[@]}" -cp "$SPARK_CLASSPATH" -Xms$SPARK_DRIVER_MEMORY -Xmx$SPARK_DRIVER_MEMORY -Dspark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS $SPARK_DRIVER_CLASS $SPARK_DRIVER_ARGS'
Error: Could not find or load main class com.cloudera.spark.tests.Sleeper
{noformat}
Serving the app jar from an HTTP server instead works around the problem.
The k8s backend should either make these files available to the cluster somehow, or error out with a more user-friendly message if that feature is not yet available.
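As a rough sketch of the HTTP workaround (the master URL, host names, and port below are placeholders, not values from this report):

{code}
# Serve the directory containing the jar over HTTP from the submit host.
cd /path/to/local
python3 -m http.server 8000 &

# Then point spark-submit at an http:// URL instead of the local path.
./bin/spark-submit \
  --master k8s://https://<api-server-host>:<port> \
  --deploy-mode cluster \
  --class com.cloudera.spark.tests.Sleeper \
  http://<submit-host>:8000/file.jar
{code}

With an http:// app resource the driver pod fetches the jar itself, so nothing needs to exist on the pod's local filesystem at submit time.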
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org