Posted to user@hadoop.apache.org by fateme Abiri <fa...@yahoo.com> on 2013/11/03 15:57:47 UTC

Submit job to remote Hadoop cluster

Hi

I want to run a MapReduce job from my IDE (NetBeans) acting as a client. My Hadoop cluster is on a different machine, so I set the following in my job configuration to submit the job remotely:

      
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

// point the client at the remote HDFS namenode and jobtracker
Configuration config = new Configuration();
config.set("fs.default.name", "hdfs://192.168.20.130:9000");
config.set("mapred.job.tracker", "192.168.20.130:9001");

Job job = new Job(config, "myJob");
job.setJarByClass(MyMapReducClass.class);
job.setMapperClass(Mapper.class);
job.setReducerClass(Reducer.class);
…...........


My Mapper.class and Reducer.class are inner classes of MyMapReducClass...

Now I want to run my code directly from the IDE (NetBeans), without using the command:
$ hadoop jar ….
I mean I use ToolRunner.run(new MyMapReducClass(), args) in my code.
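
To make it clearer, here is a stripped-down sketch of roughly what my driver looks like (the input/output paths, key/value classes, and the real map/reduce logic are omitted):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class MyMapReducClass extends Configured implements Tool {

    // my mapper and reducer are static inner classes of the driver class
    public static class Mapper
            extends org.apache.hadoop.mapreduce.Mapper<LongWritable, Text, Text, Text> {
        // map() omitted
    }

    public static class Reducer
            extends org.apache.hadoop.mapreduce.Reducer<Text, Text, Text, Text> {
        // reduce() omitted
    }

    @Override
    public int run(String[] args) throws Exception {
        // same remote cluster settings as above
        Configuration config = getConf();
        config.set("fs.default.name", "hdfs://192.168.20.130:9000");
        config.set("mapred.job.tracker", "192.168.20.130:9001");

        Job job = new Job(config, "myJob");
        job.setJarByClass(MyMapReducClass.class);
        job.setMapperClass(Mapper.class);
        job.setReducerClass(Reducer.class);
        // input/output paths and key/value classes omitted here
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // run directly from the IDE, no "hadoop jar" command
        System.exit(ToolRunner.run(new MyMapReducClass(), args));
    }
}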


But when I run the code I get this error:

attempt_201311031101_0008_m_000008_0, Status : FAILED
java.lang.RuntimeException: java.lang.ClassNotFoundException: MyMapReducClass$Mapper

As I said, I do call job.setJarByClass(MyMapReducClass.class);
and the Mapper class is inside MyMapReducClass!


If it is necessary to set the Hadoop classpath, is it possible to set it from my IDE (NetBeans)? For security reasons I don't have permission to ssh into the Hadoop cluster to set this parameter for every program execution.
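
The only workaround I can think of is pointing the job configuration at the jar file that NetBeans builds on my local machine, something like the sketch below (the jar path is just an example from my project, and I am not sure this is the right approach):

// Assumption: NetBeans builds my project jar locally at this example path.
// As far as I understand, setting "mapred.jar" makes the job client ship that
// jar to the cluster along with the job, instead of relying on setJarByClass()
// to find a jar on the classpath. This has to be set before constructing the Job.
config.set("mapred.jar", "/home/me/NetBeansProjects/MyJob/dist/MyJob.jar");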

In general:

What prerequisites do I need to meet to run a MapReduce job on a remote cluster from the IDE (NetBeans) on my local machine?
Should I copy the jar file with my classes and libraries to the Hadoop cluster? Right now they exist only on my local machine; I have not copied them to the cluster.

Please tell me what I should do.

– Thanks so much