Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/05/16 13:07:59 UTC

[jira] [Updated] (SPARK-7504) NullPointerException when initializing SparkContext in YARN-cluster mode

     [ https://issues.apache.org/jira/browse/SPARK-7504?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-7504:
-----------------------------
    Assignee: Zoltán Zvara

> NullPointerException when initializing SparkContext in YARN-cluster mode
> ------------------------------------------------------------------------
>
>                 Key: SPARK-7504
>                 URL: https://issues.apache.org/jira/browse/SPARK-7504
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy, YARN
>            Reporter: Zoltán Zvara
>            Assignee: Zoltán Zvara
>              Labels: deployment, yarn, yarn-client
>             Fix For: 1.4.0
>
>
> It is not clear to most users that, when running Spark on YARN, a {{SparkContext}} with a given execution plan can run locally in {{yarn-client}} mode, but cannot deploy itself to the cluster. Cluster deployment is currently performed by {{org.apache.spark.deploy.yarn.Client}}. (I think we should support deployment through {{SparkContext}}, but that is not the point I wish to make here.)
> Configuring a {{SparkContext}} to deploy itself currently yields an {{ERROR}} while accessing {{spark.yarn.app.id}} in {{YarnClusterSchedulerBackend}}, followed by a {{NullPointerException}} when the {{ApplicationMaster}} instance is referenced.
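> A minimal reproduction sketch (Spark 1.x API; the app name is arbitrary and the ticket does not include code, so this is only illustrative):
> {code:scala}
> import org.apache.spark.{SparkConf, SparkContext}
>
> // Requesting yarn-cluster mode directly from a driver program, without
> // submitting through org.apache.spark.deploy.yarn.Client (e.g. via spark-submit).
> val conf = new SparkConf()
>   .setAppName("direct-yarn-cluster") // arbitrary name
>   .setMaster("yarn-cluster")
>
> // No ApplicationMaster has been started, so spark.yarn.app.id was never set:
> // YarnClusterSchedulerBackend logs an ERROR reading it, and a
> // NullPointerException follows when the absent ApplicationMaster is referenced.
> val sc = new SparkContext(conf)
> {code}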
> Spark should clearly inform the user that it may be running in {{yarn-cluster}} mode without a proper submission through {{Client}}, and that deploying directly from a {{SparkContext}} is not supported.
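> A hedged sketch of such a fail-fast check inside {{SparkContext}} initialization (placement, condition, and wording are illustrative, not actual Spark code):
> {code:scala}
> // Illustrative guard: spark.yarn.app.id is only set by a proper Client
> // submission, so its absence in yarn-cluster mode means the context was
> // created directly and the user should be told to use spark-submit.
> if (master == "yarn-cluster" && !conf.contains("spark.yarn.app.id")) {
>   throw new SparkException(
>     "Detected yarn-cluster mode, but the application was not submitted through " +
>     "org.apache.spark.deploy.yarn.Client. Deploying directly from SparkContext " +
>     "is not supported; please use spark-submit.")
> }
> {code}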



