Posted to dev@spark.apache.org by Umar Javed <um...@gmail.com> on 2013/11/04 18:09:24 UTC

hadoop configuration

In 'SparkHadoopUtil.scala'
in /core/src/main/scala/org/apache/spark/deploy/, there is a method:

 def newConfiguration(): Configuration = new Configuration()

At the top of the file there is an import for Configuration:
import org.apache.hadoop.conf.Configuration

But I'm unable to find the definition of Configuration under
/core/src/main/scala/org/apache/hadoop/. The only subdirectories in that
directory are mapred and mapreduce. Does anybody know where
'Configuration' is defined?

Re: hadoop configuration

Posted by dachuan <hd...@gmail.com>.
I guess it's defined in the Hadoop library, not in Spark's own source
tree. You can download the Hadoop source code, or use an IDE to resolve
the dependency automatically; I am using the IntelliJ IDEA Community
edition.
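To make this concrete: the class resolves against whatever Hadoop JAR is on the build classpath, not against anything under core/src/main/scala. A minimal sketch, with the caveat that the exact artifact name and version below are assumptions (hadoop-client here; under Hadoop 1.x the class ships in hadoop-core) and should be matched to the Hadoop version your Spark build targets:

```scala
// build.sbt fragment (sketch; artifact/version are assumptions):
//   libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "1.0.4"

// With that JAR on the classpath, the import in SparkHadoopUtil.scala
// resolves against the Hadoop library rather than Spark's source tree:
import org.apache.hadoop.conf.Configuration

object ConfExample {
  def main(args: Array[String]): Unit = {
    // Same call as SparkHadoopUtil.newConfiguration(): constructs a
    // Configuration that loads Hadoop's default resources
    // (core-default.xml, and core-site.xml if present).
    val conf = new Configuration()
    println(conf.get("fs.default.name"))
  }
}
```

This is also why grepping Spark's own tree only turns up the mapred and mapreduce shims: those are Spark-side glue, while Configuration itself lives entirely in the Hadoop dependency.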



-- 
Dachuan Huang
Cellphone: 614-390-7234
2015 Neil Avenue
Ohio State University
Columbus, Ohio
U.S.A.
43210