Posted to issues@spark.apache.org by "Saisai Shao (JIRA)" <ji...@apache.org> on 2017/02/23 09:14:44 UTC

[jira] [Created] (SPARK-19707) Improve the invalid path handling for sc.addJar

Saisai Shao created SPARK-19707:
-----------------------------------

             Summary: Improve the invalid path handling for sc.addJar
                 Key: SPARK-19707
                 URL: https://issues.apache.org/jira/browse/SPARK-19707
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.2.0
            Reporter: Saisai Shao


Currently in Spark there are two issues when we add jars with an invalid path:

* If the jar path is an empty string ({{--jars ",dummy.jar"}}), Spark will resolve it to the current directory path and add that to the classpath / file server, which is unwanted.
* If the jar path is invalid (the file doesn't exist), the file server doesn't check this and still adds it, so the exception is not thrown until the job is running. This local path could be checked immediately, with no need to wait until the task runs. We have a similar check in {{addFile}}, but lack one in {{addJar}}; a sketch of such a check follows below.
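A minimal sketch of the kind of eager validation this issue is asking for is shown below. The helper name and the exact checks are illustrative assumptions, not the actual SPARK-19707 patch or Spark's internal API.

{code:scala}
// Hypothetical sketch: eagerly validate a local jar path before handing it
// to the file server, mirroring the kind of check addFile already performs.
// The helper name validateJarPath is illustrative only.
import java.io.{File, FileNotFoundException}
import java.net.URI

object JarPathCheck {

  def validateJarPath(path: String): Unit = {
    // An empty element (e.g. from --jars ",dummy.jar") would otherwise be
    // resolved to the current directory and silently added.
    require(path != null && path.trim.nonEmpty, "Jar path must not be empty")

    val uri = new URI(path)
    val scheme = Option(uri.getScheme).getOrElse("file")
    // Only driver-local "file" paths can be verified up front; other schemes
    // (hdfs, http, local-on-each-node, ...) still fail later when fetched.
    if (scheme == "file") {
      val file = new File(Option(uri.getPath).getOrElse(path))
      if (!file.exists()) {
        throw new FileNotFoundException(s"Jar not found: $path")
      }
    }
  }
}

// Usage (fails fast at sc.addJar time instead of at task run time):
//   JarPathCheck.validateJarPath("")            // rejected: empty path
//   JarPathCheck.validateJarPath("missing.jar") // rejected if the file is absent
{code}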



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org