Posted to issues@spark.apache.org by "Norman He (JIRA)" <ji...@apache.org> on 2015/05/22 01:30:17 UTC

[jira] [Created] (SPARK-7807) High-Availability:: SparkHadoopUtil.scala should support hadoopConfiguration.addResource()

Norman He created SPARK-7807:
--------------------------------

             Summary: High-Availability:: SparkHadoopUtil.scala should support hadoopConfiguration.addResource()
                 Key: SPARK-7807
                 URL: https://issues.apache.org/jira/browse/SPARK-7807
             Project: Spark
          Issue Type: Bug
         Environment: Running Spark against a remote Hadoop HA cluster. Ease of use with the spark.hadoop.url. prefix.

1) Users can supply SparkConf entries with the prefix spark.hadoop.url., such as spark.hadoop.url.core-site
and spark.hadoop.url.hdfs-site (see the sketch below).
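
As a rough sketch of how this could look from the application side (the host name and file locations below are placeholders, not real endpoints), plain spark.hadoop.* keys would keep their current copy-the-value behavior while spark.hadoop.url.* keys would name whole Hadoop resource files:

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .setAppName("remote-ha-example")
      // Existing behavior: everything after "spark.hadoop." is copied into the
      // Hadoop Configuration as an individual key/value pair.
      .set("spark.hadoop.dfs.nameservices", "mycluster")
      // Proposed behavior: keys under "spark.hadoop.url." point at complete
      // resource files (core-site.xml, hdfs-site.xml) to be loaded via
      // hadoopConf.addResource(new URL(value)).
      .set("spark.hadoop.url.core-site", "http://config-host/core-site.xml")
      .set("spark.hadoop.url.hdfs-site", "http://config-host/hdfs-site.xml")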

            Reporter: Norman He
            Priority: Trivial


Line 97: the existing code below should be changed to the new version shown after it:
conf.getAll.foreach { case (key, value) =>
        if (key.startsWith("spark.hadoop.")) {
          hadoopConf.set(key.substring("spark.hadoop.".length), value)
        }
      }

----------------new version-------------------------------
      conf.getAll.foreach { case (key, value) =>
        if (key.startsWith("spark.hadoop.")) {
          if (key.startsWith("spark.hadoop.url.")) {
            // Treat the value as a URL to a whole Hadoop resource file
            // (requires importing java.net.URL in SparkHadoopUtil.scala).
            hadoopConf.addResource(new URL(value))
          } else {
            // Unchanged: copy the remaining key/value pair verbatim.
            hadoopConf.set(key.substring("spark.hadoop.".length), value)
          }
        }
      }
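
For reference, Hadoop's Configuration already accepts URL-backed resources, so the proposed branch only wires SparkConf keys to that existing call. A minimal standalone sketch, assuming the URL serves a standard *-site.xml file (the address is a placeholder):

    import java.net.URL
    import org.apache.hadoop.conf.Configuration

    val hadoopConf = new Configuration()
    // Fetches the XML behind the URL and merges its properties into the
    // configuration, the same way a local core-site.xml/hdfs-site.xml would be.
    hadoopConf.addResource(new URL("http://config-host/hdfs-site.xml"))
    println(hadoopConf.get("dfs.nameservices"))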





--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org