Posted to issues@spark.apache.org by "Norman He (JIRA)" <ji...@apache.org> on 2016/07/15 18:38:20 UTC

[jira] [Closed] (SPARK-7807) High-Availability:: SparkHadoopUtil.scala should support hadoopConfiguration.addResource()

     [ https://issues.apache.org/jira/browse/SPARK-7807?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Norman He closed SPARK-7807.
----------------------------

> High-Availability:: SparkHadoopUtil.scala should support hadoopConfiguration.addResource()
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-7807
>                 URL: https://issues.apache.org/jira/browse/SPARK-7807
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>         Environment: Running Spark against a remote Hadoop HA cluster; ease of use via the spark.hadoop.url. prefix.
> 1) Users could then populate SparkConf with keys under the spark.hadoop.url. prefix, e.g. spark.hadoop.url.core-site
> and spark.hadoop.url.hdfs-site (see the usage sketch after the proposed change below).
>            Reporter: Norman He
>            Priority: Trivial
>              Labels: easyfix
>
> Line 97: the code below should be changed from:
>       conf.getAll.foreach { case (key, value) =>
>         if (key.startsWith("spark.hadoop.")) {
>           hadoopConf.set(key.substring("spark.hadoop.".length), value)
>         }
>       }
> ----------------new version-------------------------------
>       // Note: requires import java.net.URL at the top of SparkHadoopUtil.scala.
>       conf.getAll.foreach { case (key, value) =>
>         if (key.startsWith("spark.hadoop.")) {
>           if (key.startsWith("spark.hadoop.url.")) {
>             // Treat the value as a URL to an extra Hadoop configuration resource.
>             hadoopConf.addResource(new URL(value))
>           } else {
>             hadoopConf.set(key.substring("spark.hadoop.".length), value)
>           }
>         }
>       }
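>
> For illustration, a minimal sketch of how an application might use the proposed prefix, assuming the change above were merged (the http://config-host URLs are placeholders, not real endpoints):
>
>       import org.apache.spark.{SparkConf, SparkContext}
>
>       // With the proposed change, each spark.hadoop.url.* value would be passed
>       // to hadoopConf.addResource(new URL(value)), so the HA cluster's client
>       // configs (core-site.xml, hdfs-site.xml) are fetched at startup.
>       val conf = new SparkConf()
>         .setAppName("remote-ha-example")
>         .set("spark.hadoop.url.core-site", "http://config-host/conf/core-site.xml")
>         .set("spark.hadoop.url.hdfs-site", "http://config-host/conf/hdfs-site.xml")
>       val sc = new SparkContext(conf)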



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org