Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/05/18 16:50:00 UTC

[jira] [Commented] (SPARK-7706) Allow setting YARN_CONF_DIR from spark argument

    [ https://issues.apache.org/jira/browse/SPARK-7706?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14548106#comment-14548106 ] 

Sean Owen commented on SPARK-7706:
----------------------------------

Can't you just use {{YARN_CONF_DIR=... command ...}}?
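A minimal sketch of the per-invocation prefix Sean is referring to: in a POSIX shell, `VAR=value command` sets the variable only in that one command's environment, so no cluster-wide or shell-wide export is needed. The effect is demonstrated below with printenv; the spark-submit line in the comment is illustrative (path and arguments are assumptions).

```shell
# `VAR=value command` sets VAR only for that single command's environment,
# without exporting it for the whole shell or configuring every node.
# With Spark it would look like (illustrative path and arguments):
#   YARN_CONF_DIR=/etc/hadoop/conf spark-submit --master yarn-cluster app.jar
YARN_CONF_DIR=/etc/hadoop/conf printenv YARN_CONF_DIR   # prints /etc/hadoop/conf
printenv YARN_CONF_DIR || echo "unset afterwards"       # the variable did not leak
```

This is why the prefix form works from a wrapper script even when the launching process (e.g. an Oozie launcher) cannot control the node-wide environment.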

> Allow setting YARN_CONF_DIR from spark argument
> -----------------------------------------------
>
>                 Key: SPARK-7706
>                 URL: https://issues.apache.org/jira/browse/SPARK-7706
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Submit
>    Affects Versions: 1.3.1
>            Reporter: Shaik Idris Ali
>              Labels: oozie, yarn
>
> Currently in SparkSubmitArguments.scala when master is set to "yarn" (yarn-cluster mode)
> https://github.com/apache/spark/blob/b1f4ca82d170935d15f1fe6beb9af0743b4d81cd/core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala#L236
> Spark checks whether YARN_CONF_DIR or HADOOP_CONF_DIR is set in the ENV.
> However, we should additionally allow passing YARN_CONF_DIR as a command-line argument; this is particularly handy when Spark is launched from schedulers like OOZIE or FALCON.
> The reason is that the Oozie launcher app starts in one of the containers assigned by the YARN RM, and we do not want to set YARN_CONF_DIR in the ENV on every node in the cluster. Passing an argument like -yarnconfdir with the conf dir (e.g. /etc/hadoop/conf) would avoid setting the ENV variable.
> This is blocking us from onboarding Spark via Oozie or Falcon. Thanks.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org