Posted to user@hadoop.apache.org by Pankaj Gupta <pa...@brightroll.com> on 2012/11/21 05:49:29 UTC

Supplying a jar for a map-reduce job

Hi,

I am running map-reduce jobs on a Hadoop 0.23 cluster. Right now I supply the jar for the map-reduce job via the setJarByClass method on org.apache.hadoop.mapreduce.Job, which makes my code depend on a class from the MR job at compile time. What I want is to run an MR job without depending on it at compile time. It would be great if I could take a jar containing the Mapper and Reducer classes and just pass it in to run the map-reduce job. That would make it easy to choose which MR job to run at runtime. Is that possible?
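[The core of what I'm after is resolving the job's classes by name at run time instead of at compile time. A minimal plain-JDK sketch of that idea (the class name java.util.ArrayList is just a stand-in; in a real driver it would be a Mapper/Reducer class name read from configuration, combined with something like Job#setJar(path) to ship the external jar):]

```java
import java.util.List;

// Sketch: picking an implementation class by name at run time,
// with no compile-time dependency on that class.
public class RuntimeClassDemo {

    // Resolve and instantiate a class from its fully qualified name.
    // In a Hadoop driver, the name would come from job configuration
    // and the class would be loaded from the user-supplied job jar.
    static Object newInstanceByName(String className) throws Exception {
        Class<?> cls = Class.forName(className);
        return cls.getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) throws Exception {
        Object o = newInstanceByName("java.util.ArrayList");
        System.out.println(o instanceof List);  // prints "true"
    }
}
```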


Thanks in Advance,
Pankaj

Re: Supplying a jar for a map-reduce job

Posted by Bejoy KS <be...@gmail.com>.
Hi Pankaj

AFAIK you can do that. Just provide the properties such as the mapper class, reducer class, input format, and output format using the -D option at run time.
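[For example, something along these lines (the jar path and com.example class names are placeholders, and the property names are the MRv2-style ones used around Hadoop 0.23/2.x; check them against your version. Note that -D is only picked up if the driver parses generic options via ToolRunner/GenericOptionsParser):]

```shell
# Run a generic driver from an external jar, choosing the job's
# classes at run time via configuration properties.
hadoop jar my-jobs.jar com.example.GenericDriver \
  -D mapreduce.job.map.class=com.example.MyMapper \
  -D mapreduce.job.reduce.class=com.example.MyReducer \
  -D mapreduce.job.inputformat.class=org.apache.hadoop.mapreduce.lib.input.TextInputFormat \
  -D mapreduce.job.outputformat.class=org.apache.hadoop.mapreduce.lib.output.TextOutputFormat \
  /input /output
```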



Regards
Bejoy KS

Sent from handheld, please excuse typos.

-----Original Message-----
From: Pankaj Gupta <pa...@brightroll.com>
Date: Tue, 20 Nov 2012 20:49:29 
To: user@hadoop.apache.org<us...@hadoop.apache.org>
Reply-To: user@hadoop.apache.org
Subject: Supplying a jar for a map-reduce job

Hi,

I am running map-reduce jobs on a Hadoop 0.23 cluster. Right now I supply the jar for the map-reduce job via the setJarByClass method on org.apache.hadoop.mapreduce.Job, which makes my code depend on a class from the MR job at compile time. What I want is to run an MR job without depending on it at compile time. It would be great if I could take a jar containing the Mapper and Reducer classes and just pass it in to run the map-reduce job. That would make it easy to choose which MR job to run at runtime. Is that possible?


Thanks in Advance,
Pankaj
