Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/05/15 15:58:04 UTC

[jira] [Commented] (SPARK-19707) Improve the invalid path check for sc.addJar

    [ https://issues.apache.org/jira/browse/SPARK-19707?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16010747#comment-16010747 ] 

Apache Spark commented on SPARK-19707:
--------------------------------------

User 'HyukjinKwon' has created a pull request for this issue:
https://github.com/apache/spark/pull/17987

> Improve the invalid path check for sc.addJar
> --------------------------------------------
>
>                 Key: SPARK-19707
>                 URL: https://issues.apache.org/jira/browse/SPARK-19707
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.2.0
>            Reporter: Saisai Shao
>            Assignee: Saisai Shao
>             Fix For: 2.1.1, 2.2.0
>
>
> Currently in Spark there are two issues when we add jars with an invalid path:
> * If the jar path is an empty string {--jar ",dummy.jar"}, then Spark will resolve it to the current directory path and add it to the classpath / file server, which is unwanted.
> * If the jar path is an invalid path (the file doesn't exist), the file server doesn't check this and will still add it to the file server; the exception is not thrown until the job is running. This local path could be checked immediately, with no need to wait until the task runs. We have a similar check in {{addFile}}, but lack one in {{addJar}}.
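The eager validation described above can be sketched as follows. This is a minimal Python illustration of the idea, not Spark's actual Scala implementation; the function name `validate_jar_path` is hypothetical, and the remote-URI handling is an assumption (remote jars can only be validated when fetched).

```python
import os

def validate_jar_path(path: str) -> str:
    """Hypothetical sketch of the check SPARK-19707 proposes for
    sc.addJar, mirroring what addFile already does: fail fast on
    bad input instead of deferring the error until a task runs."""
    # An empty string would otherwise resolve to the current
    # directory and get added to the classpath / file server.
    if not path or not path.strip():
        raise ValueError("jar path must not be an empty string")
    # Only local paths can be checked eagerly; remote URIs
    # (e.g. hdfs://, http://) are validated later when fetched.
    if "://" not in path and not os.path.isfile(path):
        raise FileNotFoundError("Jar not found: " + path)
    return path
```

With this check in place, `sc.addJar(",dummy.jar")`-style mistakes surface at submission time rather than mid-job.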



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org