Posted to issues@spark.apache.org by "Marcelo Masiero Vanzin (Jira)" <ji...@apache.org> on 2019/11/16 00:19:00 UTC

[jira] [Resolved] (SPARK-29833) Add FileNotFoundException check for spark.yarn.jars

     [ https://issues.apache.org/jira/browse/SPARK-29833?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Masiero Vanzin resolved SPARK-29833.
--------------------------------------------
    Fix Version/s: 3.0.0
         Assignee: ulysses you
       Resolution: Fixed

> Add FileNotFoundException check for spark.yarn.jars
> ----------------------------------------------------
>
>                 Key: SPARK-29833
>                 URL: https://issues.apache.org/jira/browse/SPARK-29833
>             Project: Spark
>          Issue Type: Improvement
>          Components: YARN
>    Affects Versions: 2.4.4
>            Reporter: ulysses you
>            Assignee: ulysses you
>            Priority: Minor
>             Fix For: 3.0.0
>
>
> When `spark.yarn.jars=/xxx/xxx` is set to a plain path with no schema, Spark throws a NullPointerException.
> The reason is that `pathFs.globStatus(path)` returns null when the path does not exist, and Spark calls `pathFs.globStatus(path).filter(_.isFile())` without checking for null (a sketch of the missing check follows the snippet below).
> The related Globber code is here:
> {noformat}
>     /*
>      * When the input pattern "looks" like just a simple filename, and we
>      * can't find it, we return null rather than an empty array.
>      * This is a special case which the shell relies on.
>      *
>      * To be more precise: if there were no results, AND there were no
>      * groupings (aka brackets), and no wildcards in the input (aka stars),
>      * we return null.
>      */
>     if ((!sawWildcard) && results.isEmpty() &&
>         (flattenedPatterns.size() <= 1)) {
>       return null;
>     }
> {noformat}
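> A minimal sketch of the kind of null check the fix adds (the helper name `listJars` and its exact placement in Spark's YARN client are assumptions for illustration; `pathFs` and `path` follow the usage above):
> {noformat}
> import java.io.FileNotFoundException
> import org.apache.hadoop.fs.{FileSystem, Path}
>
> def listJars(pathFs: FileSystem, path: Path): Array[Path] = {
>   // globStatus returns null (not an empty array) when the pattern has no
>   // wildcards or groupings and matches nothing -- see the Globber comment above.
>   val statuses = pathFs.globStatus(path)
>   if (statuses == null) {
>     throw new FileNotFoundException(s"Path $path does not exist")
>   }
>   statuses.filter(_.isFile()).map(_.getPath)
> }
> {noformat}
> With a check like this, a bad `spark.yarn.jars` value fails fast with a FileNotFoundException naming the path, instead of an opaque NullPointerException.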



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org