Posted to issues@spark.apache.org by "ulysses you (Jira)" <ji...@apache.org> on 2019/11/11 04:42:00 UTC

[jira] [Updated] (SPARK-29833) Add FileNotFoundException check for spark.yarn.jars

     [ https://issues.apache.org/jira/browse/SPARK-29833?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ulysses you updated SPARK-29833:
--------------------------------
    Summary: Add FileNotFoundException check for spark.yarn.jars  (was: Add FileNotFoundException for spark.yarn.jars)

> Add FileNotFoundException check for spark.yarn.jars
> ---------------------------------------------------
>
>                 Key: SPARK-29833
>                 URL: https://issues.apache.org/jira/browse/SPARK-29833
>             Project: Spark
>          Issue Type: Improvement
>          Components: YARN
>    Affects Versions: 2.4.4
>            Reporter: ulysses you
>            Priority: Minor
>
> When `spark.yarn.jars` is set to a schema-less path such as `/xxx/xxx`, Spark throws a NullPointerException.
> The reason is that HDFS returns null from `pathFs.globStatus(path)` when the path does not exist, and Spark calls `pathFs.globStatus(path).filter(_.isFile())` without checking for null.
> The relevant Hadoop Globber code is here:
> {noformat}
>     /*
>      * When the input pattern "looks" like just a simple filename, and we
>      * can't find it, we return null rather than an empty array.
>      * This is a special case which the shell relies on.
>      *
>      * To be more precise: if there were no results, AND there were no
>      * groupings (aka brackets), and no wildcards in the input (aka stars),
>      * we return null.
>      */
>     if ((!sawWildcard) && results.isEmpty() &&
>         (flattenedPatterns.size() <= 1)) {
>       return null;
>     }
> {noformat}
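The guard the issue asks for can be sketched as follows. This is a minimal illustration, not the actual Spark patch: `globStatus` here is a hypothetical stand-in that simulates the Hadoop behavior quoted above, and `resolveJars` shows the null check that turns the NPE into a FileNotFoundException.

```java
import java.io.FileNotFoundException;

public class GlobCheck {
    // Hypothetical stand-in for FileSystem.globStatus: per the Globber
    // comment above, a plain (wildcard-free) pattern with no matches
    // yields null rather than an empty array.
    static String[] globStatus(String pattern) {
        return null; // simulate the missing-path case
    }

    // Guarded lookup: surface a FileNotFoundException instead of letting
    // a later call on the null result raise a NullPointerException.
    static String[] resolveJars(String pattern) throws FileNotFoundException {
        String[] statuses = globStatus(pattern);
        if (statuses == null) {
            throw new FileNotFoundException("Path " + pattern + " does not exist");
        }
        return statuses;
    }
}
```

With this check in place, a misconfigured `spark.yarn.jars` fails with a clear "does not exist" message instead of an NPE from the `.filter(...)` call.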



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org