Posted to issues@spark.apache.org by "Daniel Darabos (JIRA)" <ji...@apache.org> on 2014/12/11 16:43:13 UTC

[jira] [Created] (SPARK-4831) Current directory always on classpath with spark-submit

Daniel Darabos created SPARK-4831:
-------------------------------------

             Summary: Current directory always on classpath with spark-submit
                 Key: SPARK-4831
                 URL: https://issues.apache.org/jira/browse/SPARK-4831
             Project: Spark
          Issue Type: Bug
          Components: Deploy
    Affects Versions: 1.1.1, 1.2.0
            Reporter: Daniel Darabos
            Priority: Minor


We were launching an application with spark-submit, and a file (play.plugins) was on the classpath twice, which caused problems: the plugins it lists were registered twice. When we investigated how the file ended up on the classpath twice, we found one copy inside one of our jars and another copy in the current working directory. But the copy in the current working directory should not have been on the classpath at all; we never asked spark-submit to put the current directory there.
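For what it's worth, this is how we confirmed the duplication (the jar name below is a placeholder for one of our own jars, not anything Spark-specific):

{code}
# One copy ships inside our application jar:
unzip -l our-app.jar | grep play.plugins

# The other copy sits in the directory we run spark-submit from:
ls -l play.plugins
{code}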

I think this is caused by a line in [compute-classpath.sh|https://github.com/apache/spark/blob/v1.2.0-rc2/bin/compute-classpath.sh#L28]:

{code}
CLASSPATH="$SPARK_CLASSPATH:$SPARK_SUBMIT_CLASSPATH"
{code}

Now if SPARK_CLASSPATH is empty, this adds an empty entry to the classpath, and the JVM interprets an empty classpath entry as the current working directory.
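This is easy to reproduce outside of Spark; the empty entry produced by the leading ":" makes the JVM search ".". A minimal sketch (the directory, class, and jar names are made up):

{code}
mkdir -p /tmp/classpath-demo && cd /tmp/classpath-demo
cat > Hello.java <<'EOF'
public class Hello {
  public static void main(String[] args) {
    System.out.println("loaded from the current directory");
  }
}
EOF
javac Hello.java

# The leading ":" yields an empty classpath entry, so "." is searched
# and Hello is found even though we never put "." on the classpath:
java -cp ":/nonexistent.jar" Hello
{code}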

We tried setting SPARK_CLASSPATH to a bogus value, but that is [not allowed|https://github.com/apache/spark/blob/v1.2.0-rc2/core/src/main/scala/org/apache/spark/SparkConf.scala#L312].

What is the right solution? Only adding SPARK_CLASSPATH when it is non-empty? I think I can send a pull request for that. Thanks!
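A minimal sketch of what I have in mind for compute-classpath.sh (the helper name is mine, not something already in the script):

{code}
# Append an entry to CLASSPATH only if it is non-empty, so no empty
# (CWD-resolving) entries sneak in.
appendToClasspath() {
  if [ -n "$1" ]; then
    CLASSPATH="${CLASSPATH:+$CLASSPATH:}$1"
  fi
}

appendToClasspath "$SPARK_CLASSPATH"
appendToClasspath "$SPARK_SUBMIT_CLASSPATH"
{code}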


