Posted to issues@spark.apache.org by "ASF GitHub Bot (JIRA)" <ji...@apache.org> on 2018/12/10 22:35:00 UTC

[jira] [Commented] (SPARK-25200) Allow setting HADOOP_CONF_DIR as a spark property

    [ https://issues.apache.org/jira/browse/SPARK-25200?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16715692#comment-16715692 ] 

ASF GitHub Bot commented on SPARK-25200:
----------------------------------------

vanzin commented on issue #22289: [SPARK-25200][YARN] Allow specifying HADOOP_CONF_DIR as spark property
URL: https://github.com/apache/spark/pull/22289#issuecomment-446000384
 
 
   @adambalogh if you're not planning to address the issues in this PR we should probably close it.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


> Allow setting HADOOP_CONF_DIR as a spark property
> -------------------------------------------------
>
>                 Key: SPARK-25200
>                 URL: https://issues.apache.org/jira/browse/SPARK-25200
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.1
>            Reporter: Adam Balogh
>            Priority: Major
>
> When submitting applications to YARN in cluster mode using the InProcessLauncher, Spark locates the cluster's configuration files via the HADOOP_CONF_DIR environment variable. Because an environment variable is process-wide, this makes it impossible to submit to more than one YARN cluster concurrently with the InProcessLauncher.
> I think we should make it possible to define HADOOP_CONF_DIR as a Spark property, so it can differ for each Spark submission.
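The constraint the report describes can be seen with plain Java: an environment variable such as HADOOP_CONF_DIR is a single process-wide value, so two concurrent in-process submissions cannot each point it at a different cluster, whereas a per-submission properties map (as passed via InProcessLauncher.setConf) could. A minimal sketch, where the property name "spark.yarn.conf.dir" and the directory paths are hypothetical since the PR was never merged:

```java
import java.util.HashMap;
import java.util.Map;

public class HadoopConfDirDemo {

    // Process-wide lookup: every in-process submission sees the same
    // HADOOP_CONF_DIR, so only one YARN cluster can be targeted at a time.
    static String confDirFromEnv() {
        String dir = System.getenv("HADOOP_CONF_DIR");
        // Fallback path is illustrative only.
        return dir != null ? dir : "/etc/hadoop/conf";
    }

    // Per-submission lookup: each launch carries its own properties map,
    // analogous to setting a (hypothetical) "spark.yarn.conf.dir" property
    // on each InProcessLauncher instance.
    static String confDirFromProps(Map<String, String> sparkConf) {
        return sparkConf.getOrDefault("spark.yarn.conf.dir", confDirFromEnv());
    }

    public static void main(String[] args) {
        Map<String, String> clusterA = new HashMap<>();
        clusterA.put("spark.yarn.conf.dir", "/etc/hadoop/cluster-a");

        Map<String, String> clusterB = new HashMap<>();
        clusterB.put("spark.yarn.conf.dir", "/etc/hadoop/cluster-b");

        // Two concurrent submissions can now resolve different config dirs
        // from the same JVM, which the environment variable cannot express.
        System.out.println(confDirFromProps(clusterA));
        System.out.println(confDirFromProps(clusterB));
    }
}
```

This only illustrates why a property-based lookup sidesteps the process-wide environment; the real change would have to thread such a property through Spark's YARN client code.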



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org