Posted to hdfs-user@hadoop.apache.org by Dipesh Khakhkhar <di...@gmail.com> on 2012/10/26 01:10:18 UTC

Unsatisfied link error - how to load native library without copying it in /lib/native folder

Hi,

I am a new Hadoop user and have a few very basic questions (they might
sound very stupid to many people, so please bear with me).

I am running an MR task, and my launcher program needs to load a library
using System.loadLibrary(somelibrary). This works fine if I put this
library in lib/native/Linux-amd64-64. I tried the following:

1. provided -files=/path_to_directory_containing_my_library
2. provided the following in mapred-site.xml (didn't try it in
core-site.xml or hdfs-site.xml):

-Djava.library.path=/path_to_directory_containing_my_library
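
A minimal sketch of the load pattern described above, for reference - the
class name and the debug print are illustrative, not from the original
code:

    public class NativeLoader {
        public static void main(String[] args) {
            // Show where the JVM looks for native libraries; useful when
            // chasing an UnsatisfiedLinkError.
            System.out.println("java.library.path = "
                    + System.getProperty("java.library.path"));
            // Expects a file named libmylibrary.so on java.library.path
            // (on Linux).
            System.loadLibrary("mylibrary");
        }
    }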

I'm using Hadoop 1.0.3, and this is a single-node cluster for testing
purposes.

I have a production environment where I'm running 4 data nodes, and
currently I'm copying this file into the lib/native/Linux-amd64-64 folder
of each node's Hadoop installation.

A related question regarding providing the jars required for running the
whole M/R application - currently I have edited the HADOOP_CLASSPATH
variable in hadoop-env.sh. On the cluster, if I provide the -libjars
option, will that work without editing the classpath? I need this jar's
classes before launching the M/R jobs.

Also, how can I provide my application jar (i.e. bin/hadoop jar myjar
com.x.x.ProgramName) to the data nodes? Currently I'm copying it into the
lib directory of the Hadoop installation.

Thanks in advance for answering my queries.

Thanks.

Re: Unsatisfied link error - how to load native library without copying it in /lib/native folder

Posted by Dipesh Khakhkhar <di...@gmail.com>.
Yes, I am trying to use both (classes from my jar file and the native
library) before submitting the job to the cluster.

Everything works if I put the native library in the
lib/native/Linux-amd64-64 folder and add the path to my jar in
hadoop-env.sh.

I thought the -files/-archives/-libjars options would be very useful for
running jobs on every data node without any need to copy these jars and
libraries to each node, let alone into the Hadoop folders.

Thanks.
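
(One hedged client-side workaround, since the library is needed in the
submitting JVM itself: extend that JVM's library path at launch, e.g.
env HADOOP_OPTS="-Djava.library.path=/local/native" bin/hadoop jar myjar
com.x.x.ProgramName - the /local/native path is a placeholder. The -files
option ships resources to the task JVMs via the distributed cache; it
does not change the submitting JVM's java.library.path.)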


On Thu, Oct 25, 2012 at 5:59 PM, Brock Noland <br...@cloudera.com> wrote:

> 1) Does your local program use the native library before submitting
> the job to the cluster?
>
> Here is an example of using native code in MR
> https://github.com/brockn/hadoop-thumbnail
>
> 2) I thought libjars would work for local classpath issues as well as
> remote ones. However, to add the jar to your local classpath as well,
> you can:
>
> env HADOOP_CLASSPATH=my.jar hadoop jar ...
>
> Brock
>
>
> On Thu, Oct 25, 2012 at 7:11 PM, Dipesh Khakhkhar
> <di...@gmail.com> wrote:
> > Thanks for answering my query.
> >
> > 1. I have tried -files path_to_my_library.so while invoking my MR
> > application, but I still get UnsatisfiedLinkError: no mylibrary in
> > java.library.path
> >
> > 2. I have removed the path to my jar from HADOOP_CLASSPATH in
> > hadoop-env.sh, provided -libjars path_to_myfile.jar, and tried running
> > my MR application (bin/hadoop jar ...), but it failed to load classes
> > from the jar file mentioned in the libjars path. I'm using classes
> > from this jar before launching my M/R jobs.
> >
> > Unfortunately above methods didn't work for me.
> >
> > Thanks.
> >
> >
> > On Thu, Oct 25, 2012 at 4:50 PM, Brock Noland <br...@cloudera.com>
> wrote:
> >>
> >> Hi,
> >>
> >> That should be:
> >>
> >> -files path_to_my_library.so
> >>
> >> and to include jars in for your mrjobs, you would do:
> >>
> >> 2) -libjars path_to_my1.jar,path_to_my2.jar
> >>
> >> Brock
> >>
> >> On Thu, Oct 25, 2012 at 6:10 PM, Dipesh Khakhkhar
> >> <di...@gmail.com> wrote:
> >> > Hi,
> >> >
> >> > I am a new hadoop user and have few very basic questions (they might
> >> > sound
> >> > very stupid to many people so please bear with me).
> >> >
> >> > I am running a MR task and my launcher program needs to load a library
> >> > using
> >> > System.loadLibrary(somelibrary). This works fine if I put this library
> >> > in
> >> > lib/native/Linux-amd64-64. I tried the following -
> >> >
> >> > 1. provided -files=/path_to_directory_containing_my_library
> >> > 2. provided the following in mapred-site.xml (didn't try it in
> >> > core-site.xml
> >> > or hdfs-site.xml)
> >> >
> >> > -Djava.library.path=/path_to_directory_containing_my_library
> >> >
> >> > I'm using hadoop 1.0.3 and this is a single node cluster for testing
> >> > purpose.
> >> >
> >> > I have a production environment where I'm running 4 data nodes and
> >> > currently
> >> > I'm copying this file in  lib/native/Linux-amd64-64 folder in each
> >> > node's
> >> > hadoop installation.
> >> >
> >> > A related question regarding providing jars required for running the
> >> > whole
> >> > M/R application - currently I have edited the HADOOP_CLASSPATH variable in
> >> > hadoop-env.sh. For cluster if I provide -libjars option will that work
> >> > without editing classpath? I require this jar's classes before
> launching
> >> > M/R
> >> > jobs.
> >> >
> >> > Also how can I provide my application jar ( i.e. bin/hadoop jar myjar
> >> > com.x.x.ProgramName )  in the data nodes? Currently I'm copying it in
> >> > the
> >> > lib directory of hadoop installation.
> >> >
> >> > Thanks in advance for answering my queries.
> >> >
> >> > Thanks.
> >>
> >>
> >>
> >> --
> >> Apache MRUnit - Unit testing MapReduce -
> >> http://incubator.apache.org/mrunit/
> >
> >
>
>
>
> --
> Apache MRUnit - Unit testing MapReduce -
> http://incubator.apache.org/mrunit/
>

Re: Unsatisfied link error - how to load native library without copying it in /lib/native folder

Posted by Brock Noland <br...@cloudera.com>.
1) Does your local program use the native library before submitting
the job to the cluster?

Here is an example of using native code in MR
https://github.com/brockn/hadoop-thumbnail
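
A minimal sketch of the task-side pattern, assuming the .so was shipped
with -files (which symlinks it into each task's working directory); the
class and file names here are hypothetical:

    import java.io.File;
    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class NativeUsingMapper
            extends Mapper<LongWritable, Text, Text, NullWritable> {

        @Override
        protected void setup(Context context) throws IOException {
            // -files libmylibrary.so leaves a symlink in the task's cwd;
            // loading by absolute path sidesteps java.library.path lookups.
            System.load(new File("libmylibrary.so").getAbsolutePath());
        }

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // ... call into the native code here ...
            context.write(value, NullWritable.get());
        }
    }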

2) I thought libjars would work for local classpath issues as well as
remote ones. However, to add the jar to your local classpath as well,
you can:

env HADOOP_CLASSPATH=my.jar hadoop jar ...

Brock
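
A hedged aside, since the thread doesn't show the driver code: -files and
-libjars are consumed by GenericOptionsParser, so they only take effect
when the driver runs through it, typically via ToolRunner. A minimal
driver sketch of that shape, with hypothetical class and job names:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.conf.Configured;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.util.Tool;
    import org.apache.hadoop.util.ToolRunner;

    public class MyDriver extends Configured implements Tool {

        @Override
        public int run(String[] args) throws Exception {
            // getConf() already reflects -files/-libjars/-D here, because
            // ToolRunner ran GenericOptionsParser before calling run().
            Job job = new Job(getConf(), "my-job");
            job.setJarByClass(MyDriver.class);
            // ... set mapper/reducer and input/output paths from args ...
            return job.waitForCompletion(true) ? 0 : 1;
        }

        public static void main(String[] args) throws Exception {
            System.exit(ToolRunner.run(new Configuration(), new MyDriver(), args));
        }
    }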


On Thu, Oct 25, 2012 at 7:11 PM, Dipesh Khakhkhar
<di...@gmail.com> wrote:
> Thanks for answering my query.
>
> > 1. I have tried -files path_to_my_library.so while invoking my MR
> > application, but I still get UnsatisfiedLinkError: no mylibrary in
> > java.library.path
> >
> > 2. I have removed the path to my jar from HADOOP_CLASSPATH in
> > hadoop-env.sh, provided -libjars path_to_myfile.jar, and tried running
> > my MR application (bin/hadoop jar ...), but it failed to load classes
> > from the jar file mentioned in the libjars path. I'm using classes
> > from this jar before launching my M/R jobs.
>
> Unfortunately above methods didn't work for me.
>
> Thanks.
>
>
> On Thu, Oct 25, 2012 at 4:50 PM, Brock Noland <br...@cloudera.com> wrote:
>>
>> Hi,
>>
>> That should be:
>>
>> -files path_to_my_library.so
>>
>> and to include jars in for your mrjobs, you would do:
>>
>> 2) -libjars path_to_my1.jar,path_to_my2.jar
>>
>> Brock
>>
>> On Thu, Oct 25, 2012 at 6:10 PM, Dipesh Khakhkhar
>> <di...@gmail.com> wrote:
>> > Hi,
>> >
>> > I am a new hadoop user and have few very basic questions (they might
>> > sound
>> > very stupid to many people so please bear with me).
>> >
>> > I am running a MR task and my launcher program needs to load a library
>> > using
>> > System.loadLibrary(somelibrary). This works fine if I put this library
>> > in
>> > lib/native/Linux-amd64-64. I tried the following -
>> >
> >> > 1. provided -files=/path_to_directory_containing_my_library
>> > 2. provided the following in mapred-site.xml (didn't try it in
>> > core-site.xml
>> > or hdfs-site.xml)
>> >
> >> > -Djava.library.path=/path_to_directory_containing_my_library
>> >
>> > I'm using hadoop 1.0.3 and this is a single node cluster for testing
>> > purpose.
>> >
>> > I have a production environment where I'm running 4 data nodes and
>> > currently
>> > I'm copying this file in  lib/native/Linux-amd64-64 folder in each
>> > node's
>> > hadoop installation.
>> >
>> > A related question regarding providing jars required for running the
>> > whole
> >> > M/R application - currently I have edited the HADOOP_CLASSPATH variable in
>> > hadoop-env.sh. For cluster if I provide -libjars option will that work
>> > without editing classpath? I require this jar's classes before launching
>> > M/R
>> > jobs.
>> >
>> > Also how can I provide my application jar ( i.e. bin/hadoop jar myjar
>> > com.x.x.ProgramName )  in the data nodes? Currently I'm copying it in
>> > the
>> > lib directory of hadoop installation.
>> >
>> > Thanks in advance for answering my queries.
>> >
>> > Thanks.
>>
>>
>>
>> --
>> Apache MRUnit - Unit testing MapReduce -
>> http://incubator.apache.org/mrunit/
>
>



-- 
Apache MRUnit - Unit testing MapReduce - http://incubator.apache.org/mrunit/

Re: Unsatisfied link error - how to load native library without copying it in /lib/native folder

Posted by Dipesh Khakhkhar <di...@gmail.com>.
Thanks for answering my query.

1. I have tried -files path_to_my_library.so while invoking my MR
application, but I still get UnsatisfiedLinkError: no mylibrary in
java.library.path.

2. I have removed the path to my jar from HADOOP_CLASSPATH in
hadoop-env.sh, provided -libjars path_to_myfile.jar, and tried running my
MR application (bin/hadoop jar ...), but it failed to load classes from
the jar file mentioned in the libjars path. I'm using classes from this
jar before launching my M/R jobs.

Unfortunately, the above methods didn't work for me.

Thanks.


On Thu, Oct 25, 2012 at 4:50 PM, Brock Noland <br...@cloudera.com> wrote:

> Hi,
>
> That should be:
>
> -files path_to_my_library.so
>
> and to include jars in for your mrjobs, you would do:
>
> 2) -libjars path_to_my1.jar,path_to_my2.jar
>
> Brock
>
> On Thu, Oct 25, 2012 at 6:10 PM, Dipesh Khakhkhar
> <di...@gmail.com> wrote:
> > Hi,
> >
> > I am a new hadoop user and have few very basic questions (they might
> sound
> > very stupid to many people so please bear with me).
> >
> > I am running a MR task and my launcher program needs to load a library
> using
> > System.loadLibrary(somelibrary). This works fine if I put this library in
> > lib/native/Linux-amd64-64. I tried the following -
> >
> > 1. provided -files=/path_to_directory_containing_my_library
> > 2. provided the following in mapred-site.xml (didn't try it in
> core-site.xml
> > or hdfs-site.xml)
> >
> > -Djava.library.path=/path_to_directory_containing_my_library
> >
> > I'm using hadoop 1.0.3 and this is a single node cluster for testing
> > purpose.
> >
> > I have a production environment where I'm running 4 data nodes and
> currently
> > I'm copying this file in  lib/native/Linux-amd64-64 folder in each node's
> > hadoop installation.
> >
> > A related question regarding providing jars required for running the
> whole
> > M/R application - currently I have edited the HADOOP_CLASSPATH variable in
> > hadoop-env.sh. For cluster if I provide -libjars option will that work
> > without editing classpath? I require this jar's classes before launching
> M/R
> > jobs.
> >
> > Also how can I provide my application jar ( i.e. bin/hadoop jar myjar
> > com.x.x.ProgramName )  in the data nodes? Currently I'm copying it in the
> > lib directory of hadoop installation.
> >
> > Thanks in advance for answering my queries.
> >
> > Thanks.
>
>
>
> --
> Apache MRUnit - Unit testing MapReduce -
> http://incubator.apache.org/mrunit/
>

Re: Unsatisfied link error - how to load native library without copying it in /lib/native folder

Posted by Brock Noland <br...@cloudera.com>.
Hi,

That should be:

1) -files path_to_my_library.so

and to include jars for your MR jobs, you would do:

2) -libjars path_to_my1.jar,path_to_my2.jar

Brock
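
Putting both options on the submit command, a sketch (the paths are
placeholders, and this assumes the driver parses generic options, e.g.
via ToolRunner):

    bin/hadoop jar myjar com.x.x.ProgramName \
        -files /local/path/libmylibrary.so \
        -libjars /local/path/my1.jar,/local/path/my2.jar <job args>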

On Thu, Oct 25, 2012 at 6:10 PM, Dipesh Khakhkhar
<di...@gmail.com> wrote:
> Hi,
>
> I am a new hadoop user and have few very basic questions (they might sound
> very stupid to many people so please bear with me).
>
> I am running a MR task and my launcher program needs to load a library using
> System.loadLibrary(somelibrary). This works fine if I put this library in
> lib/native/Linux-amd64-64. I tried the following -
>
> 1. provided -files=/path_to_directory_containing_my_library
> 2. provided the following in mapred-site.xml (didn't try it in core-site.xml
> or hdfs-site.xml)
>
> -Djava.library.path=/path_to_directory_containing_my_library
>
> I'm using hadoop 1.0.3 and this is a single node cluster for testing
> purpose.
>
> I have a production environment where I'm running 4 data nodes and currently
> I'm copying this file in  lib/native/Linux-amd64-64 folder in each node's
> hadoop installation.
>
> A related question regarding providing jars required for running the whole
> M/R application - currently I have edited the HADOOP_CLASSPATH variable in
> hadoop-env.sh. For cluster if I provide -libjars option will that work
> without editing classpath? I require this jar's classes before launching M/R
> jobs.
>
> Also how can I provide my application jar ( i.e. bin/hadoop jar myjar
> com.x.x.ProgramName )  in the data nodes? Currently I'm copying it in the
> lib directory of hadoop installation.
>
> Thanks in advance for answering my queries.
>
> Thanks.



-- 
Apache MRUnit - Unit testing MapReduce - http://incubator.apache.org/mrunit/
