Posted to user@hadoop.apache.org by rab ra <ra...@gmail.com> on 2015/01/16 18:53:09 UTC

Launching Hadoop map reduce job from a servlet

Hello,

I have a servlet deployed in a Jetty server listening on port 8080. As soon
as a request arrives from a client, the servlet parses it and instantiates an
MR program that is to be launched on the Hadoop cluster. In this setup I
cannot launch the job from the command line with 'hadoop jar <jar file> ...'.
Instead, from the servlet code I instantiate the MR main class, which
implements Tool and contains the Mapper and Reducer classes.

My issue is that although the job is launched, it always uses the
LocalJobRunner. Hadoop is installed and all the configuration files contain
the right information; for instance, in my mapred-site.xml I have set 'yarn'
as my mapreduce framework.
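
That is, mapred-site.xml carries the standard entry:

    <property>
      <name>mapreduce.framework.name</name>
      <value>yarn</value>
    </property>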

With the current configuration I am able to submit jobs to YARN through the
hadoop command, but I want to achieve the same through the 'java' command.
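
The hadoop launcher puts $HADOOP_CONF_DIR and the Hadoop jars on the
classpath; a bare 'java' invocation does not. A minimal sketch of an
equivalent launch, where the jar path and driver class are placeholders:

    java -cp "$(hadoop classpath):/path/to/myapp.jar" com.example.MyDriver <args>

Here 'hadoop classpath' prints the classpath the hadoop command itself would
use, including the configuration directory.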

1. How can I do it? If there is any pointer/link, please share it.
2. I tried to set up all the configuration inside the code, something like
this:

    ....
    conf.set("mapreduce,framework.name","yarn");
    ....

But somehow this information does not seem to cascade to the job, even though
the job instance is created with the above configuration. So I am struggling
to get the Hadoop configuration through to the Java application.
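
For illustration, a minimal sketch of submitting from plain Java while
loading the cluster's configuration files explicitly; the /etc/hadoop/conf
paths and the class itself are placeholders:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapreduce.Job;

    public class SubmitFromJava {
        public static void main(String[] args) throws Exception {
            // Load the cluster's config files explicitly so a plain
            // 'java' launch sees the same settings as 'hadoop jar'.
            Configuration conf = new Configuration();
            conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
            conf.addResource(new Path("/etc/hadoop/conf/mapred-site.xml"));
            conf.addResource(new Path("/etc/hadoop/conf/yarn-site.xml"));

            Job job = Job.getInstance(conf, "submitted-from-java");
            job.setJarByClass(SubmitFromJava.class);
            // ... setMapperClass / setReducerClass / input & output paths ...
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

The same effect can be had by putting the configuration directory on the java
classpath instead of calling addResource, since the Hadoop client loads the
*-site.xml files from the classpath by default.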

I would be grateful for any help with fixing this issue.


regards
rab

Re: Launching Hadoop map reduce job from a servlet

Posted by Ravi Prakash <ra...@ymail.com>.
Hi Rab!
I think you have a comma between 'mapreduce' and 'framework.name' where it should be a period. You can also look at the job's logs to check whether the value of mapreduce.framework.name was indeed passed.
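
For illustration, the corrected call plus a quick way to check what the job
actually received:

    // Period, not comma, in the property key:
    conf.set("mapreduce.framework.name", "yarn");

    // Verify what the job's configuration ended up with:
    Job job = Job.getInstance(conf);
    System.out.println(job.getConfiguration().get("mapreduce.framework.name"));
    // Prints "yarn" if the setting took effect; "local" is the default.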

HTH
 
