Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2017/11/14 18:14:00 UTC

[jira] [Commented] (SPARK-22519) ApplicationMaster.cleanupStagingDir() throws NPE when SPARK_YARN_STAGING_DIR env var is not available

    [ https://issues.apache.org/jira/browse/SPARK-22519?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16251866#comment-16251866 ] 

Marcelo Vanzin commented on SPARK-22519:
----------------------------------------

The question is, when would it ever be null, since it's set by Client.scala?

{code:title=Client.scala|borderStyle=solid}
  private def setupLaunchEnv(
      stagingDirPath: Path,
      pySparkArchives: Seq[String]): HashMap[String, String] = {
    ...
    env("SPARK_YARN_STAGING_DIR") = stagingDirPath.toString
{code}
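
That said, if there really is a path where the AM comes up without that env var, the null check has to happen on the env value itself, before the Path is constructed. A minimal sketch of the reordered check (assuming the surrounding cleanupStagingDir() context; not an actual patch):

{code:title=ApplicationMaster.scala (sketch)|borderStyle=solid}
// Hypothetical null-safe variant; not the actual patch.
// Read the env var first, and only build the Path once we know it is set.
val stagingDirEnv = System.getenv("SPARK_YARN_STAGING_DIR")
if (stagingDirEnv == null) {
  logError("Staging directory is null")
  return
}
val stagingDirPath = new Path(stagingDirEnv)
{code}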

> ApplicationMaster.cleanupStagingDir() throws NPE when SPARK_YARN_STAGING_DIR env var is not available
> -----------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-22519
>                 URL: https://issues.apache.org/jira/browse/SPARK-22519
>             Project: Spark
>          Issue Type: Improvement
>          Components: YARN
>    Affects Versions: 2.2.0
>            Reporter: Devaraj K
>            Priority: Minor
>
> In the code below, the condition checks whether stagingDirPath is null, but stagingDirPath can never be null at that point. If the SPARK_YARN_STAGING_DIR env var is null, the NPE is thrown while creating the Path, before the check is ever reached.
> {code:title=ApplicationMaster.scala|borderStyle=solid}
>         stagingDirPath = new Path(System.getenv("SPARK_YARN_STAGING_DIR"))
>         if (stagingDirPath == null) {
>           logError("Staging directory is null")
>           return
>         }
> {code}
> Here we need to check whether System.getenv("SPARK_YARN_STAGING_DIR") is null, not whether stagingDirPath is.


