Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:16:39 UTC

[jira] [Resolved] (SPARK-24174) Expose Hadoop config as part of /environment API

     [ https://issues.apache.org/jira/browse/SPARK-24174?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-24174.
----------------------------------
    Resolution: Incomplete

> Expose Hadoop config as part of /environment API
> ------------------------------------------------
>
>                 Key: SPARK-24174
>                 URL: https://issues.apache.org/jira/browse/SPARK-24174
>             Project: Spark
>          Issue Type: Wish
>          Components: Spark Core
>    Affects Versions: 2.1.0
>            Reporter: Nikolay Sokolov
>            Priority: Minor
>              Labels: bulk-closed, features, usability
>
> Currently, the UI and the /environment API of the HistoryServer or WebUI expose only system properties and SparkConf. However, when Spark is used in conjunction with Hadoop, it is often useful to know the Hadoop configuration properties as well: for example, HDFS or GS buffer sizes, Hive metastore settings, and so on.
> So it would be good to have Hadoop properties exposed in the /environment API too, for example:
> {code:none}
> GET .../application_1525395994996_5/environment
> {
>    "runtime": {"javaVersion": "1.8.0_131 (Oracle Corporation)", ...}
>    "sparkProperties": ["java.io.tmpdir","/tmp", ...],
>    "systemProperties": [["spark.yarn.jars", "local:/usr/lib/spark/jars/*"], ...],
>    "classpathEntries": [["/usr/lib/hadoop/hadoop-annotations.jar","System Classpath"], ...],
>    "hadoopProperties": [["dfs.stream-buffer-size", 4096], ...],
> }
> {code}
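> In the meantime, a minimal workaround sketch (not part of this proposal): the same properties can be read on the driver through SparkContext.hadoopConfiguration, which merges the Hadoop site XML files with any spark.hadoop.* overrides from SparkConf. The application name below is just an illustrative placeholder:
> {code:scala}
> import scala.collection.JavaConverters._
> import org.apache.spark.sql.SparkSession
>
> object DumpHadoopConf {
>   def main(args: Array[String]): Unit = {
>     // appName is an arbitrary example, not a required value.
>     val spark = SparkSession.builder().appName("dump-hadoop-conf").getOrCreate()
>     // hadoopConfiguration merges core-site.xml, hdfs-site.xml, etc. with
>     // any spark.hadoop.* entries set in SparkConf.
>     val hadoopConf = spark.sparkContext.hadoopConfiguration
>     hadoopConf.iterator().asScala
>       .map(e => (e.getKey, e.getValue))
>       .toSeq
>       .sortBy(_._1)
>       .foreach { case (k, v) => println(s"$k=$v") }
>     spark.stop()
>   }
> }
> {code}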



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org