Posted to common-user@hadoop.apache.org by Prateek Jindal <ji...@illinois.edu> on 2010/02/05 23:58:51 UTC

Not able to compile '.java' files

 Hi everyone,

I am new to MapReduce and am trying to run a very basic MapReduce
application, but I ran into the following problems. Can someone help me
with them?

1) I have three files: MaxTemperature.java, MaxTemperatureMapper.java,
and MaxTemperatureReducer.java. I have to compile them to get the
'.class' files that the 'hadoop' command will use. I tried the following:

'javac -cp .:/hadoop/lib MaxTemperatureMapper.java'

But javac gives me an error saying it doesn't recognize the packages
'org.apache.hadoop.io', 'org.apache.hadoop.mapred', and so on.

Can someone suggest a fix?

2) Also, do we necessarily have to create the '.class' files ourselves?
Or can hadoop somehow compile the .java source files into .class files
itself?

Thanks,
Prateek.

Re: Not able to compile '.java' files

Posted by Prateek Jindal <ji...@illinois.edu>.
Thanks for the help, Brien.

--Prateek.

Re: Not able to compile '.java' files

Posted by brien colwell <xc...@gmail.com>.
To get a feel for Hadoop, I'd recommend starting with Eclipse and a
single node. If you add all the Hadoop JARs to your Eclipse build path
(I think there are five), Eclipse will manage the classpath for you.

The following config settings set up Hadoop to use the local file
system and run your MapReduce job in a single JVM. That way you can
also set breakpoints and step through the job:

Configuration baseConf = new Configuration();

// run the job tracker in-process rather than on a cluster
baseConf.set("mapred.job.tracker", "local");
// use the local file system rather than HDFS
baseConf.set("fs.default.name", "file:///");

// scratch and storage directories under a local temp root
baseConf.set("mapred.system.dir",
        String.format("%s/hadoop/mapred/system", LOCAL_TEMP_DIR));
baseConf.set("mapred.local.dir",
        String.format("%s/hadoop/mapred/data", LOCAL_TEMP_DIR));
baseConf.set("dfs.name.dir",
        String.format("%s/hadoop/namespace", LOCAL_TEMP_DIR));
baseConf.set("dfs.data.dir",
        String.format("%s/hadoop/data", LOCAL_TEMP_DIR));


Then use this configuration when setting up a JobConf.
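
For example, a minimal driver built on that configuration might look
like the following. This is just a sketch: it assumes the old
org.apache.hadoop.mapred API and the MaxTemperature mapper/reducer
classes from your question.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class MaxTemperature {
    public static void main(String[] args) throws Exception {
        Configuration baseConf = new Configuration();
        baseConf.set("mapred.job.tracker", "local"); // single-JVM runner
        baseConf.set("fs.default.name", "file:///"); // local file system

        // build the job on top of the base configuration
        JobConf conf = new JobConf(baseConf, MaxTemperature.class);
        conf.setJobName("max temperature");

        // input and output paths come from the command line
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        conf.setMapperClass(MaxTemperatureMapper.class);
        conf.setReducerClass(MaxTemperatureReducer.class);
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);

        JobClient.runJob(conf); // blocks until the job completes
    }
}

With everything running in one JVM, breakpoints set in the mapper or
reducer will be hit when you run main() from Eclipse in debug mode.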


hope that helps,

Brien

Re: Not able to compile '.java' files

Posted by Prateek Jindal <ji...@illinois.edu>.
Hi Mikhail,

Thanks a lot for your clear response.

--Prateek.

Re: Not able to compile '.java' files

Posted by Mikhail Yakshin <gr...@gmail.com>.
Hi,

> 1) I have three files: MaxTemperature.java, MaxTemperatureMapper.java,
> and MaxTemperatureReducer.java. I have to compile them to get the
> '.class' files that the 'hadoop' command will use. I tried the following:
>
> 'javac -cp .:/hadoop/lib MaxTemperatureMapper.java'
>
> But javac gives me an error saying it doesn't recognize the packages
> 'org.apache.hadoop.io', 'org.apache.hadoop.mapred', and so on.
>
> Can someone suggest a fix?

Well, for a start, there are two problems with adding "/hadoop/lib" to
your classpath:

1) Literally, "/hadoop/lib" means that java will look for "/hadoop" in
the root of your filesystem and then for "lib" inside it. If you meant
the "hadoop" subdirectory of your current directory, use "hadoop/lib",
not "/hadoop/lib".

2) You have to add the individual jars to the classpath, e.g.
"hadoop/lib/hadoop-0.18.3-core.jar:hadoop/lib/hadoop-0.18.3-test.jar:...",
and so on. In recent versions of the JDK you can also specify
"hadoop/lib/*", but make sure you quote it or escape the "*" so it
doesn't get expanded by your shell.
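
For instance, assuming a Java 6 javac and the jars under "hadoop/lib"
(just a sketch: adjust the paths and jar names to your installation):

javac -cp ".:hadoop/lib/*" MaxTemperature.java \
    MaxTemperatureMapper.java MaxTemperatureReducer.java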

> 2) Also, do we necessarily have to create the '.class' files ourselves?
> Or can hadoop somehow compile the .java source files into .class files
> itself?

Nope, Hadoop is not a Java compiler. Moreover, it's usually not enough
to just create .class files: Hadoop generally expects .class files
packed into a .jar archive, which is the standard way of distributing
Java software.

Running compilers, packagers, etc. by hand is rarely a good idea.
Consider using "ant" for building, or an IDE such as Eclipse, NetBeans,
or IDEA.
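
If you do want to do it by hand, the full compile/package/run cycle
looks roughly like this (just a sketch: the jar name, main class, and
input/output paths are illustrative):

mkdir -p classes
javac -cp ".:hadoop/lib/*" -d classes \
    MaxTemperature.java MaxTemperatureMapper.java MaxTemperatureReducer.java
jar cf maxtemp.jar -C classes .
hadoop jar maxtemp.jar MaxTemperature input output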

-- 
WBR, Mikhail Yakshin