Posted to hdfs-user@hadoop.apache.org by Ravi Kiran <ra...@gmail.com> on 2013/08/23 07:28:32 UTC

Re: is it possible to run an executable jar with ClientAPI?

Hi,
    You can definitely run the Driver (ClassWithMain) against a remote Hadoop
cluster from, say, Eclipse by following these steps:
a) Have the jar (Some.jar) on the classpath of your project in Eclipse (see
the note after the code below on shipping the jar with the job).
b) Ensure you have set both the NameNode and JobTracker addresses, either in
core-site.xml and mapred-site.xml or through conf.set(...) (see the sketch
after the code below).
c) In the main method of the Driver class, have the following. Below, *hdfs*
is a user who has permissions to run jobs on the Hadoop cluster.

import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.security.UserGroupInformation;
import org.apache.hadoop.util.ToolRunner;

public static void main(final String[] args) {
    int status = 0;
    try {
        // Run the job as a remote user who may submit jobs on the cluster
        UserGroupInformation ugi = UserGroupInformation.createRemoteUser("hdfs");
        status = ugi.doAs(new PrivilegedExceptionAction<Integer>() {
            @Override
            public Integer run() throws Exception {
                return ToolRunner.run(new Driver(), args);
            }
        });
    } catch (Exception e) {
        e.printStackTrace();
        status = 1;
    }
    System.exit(status);
}
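
For step (b), here is a minimal sketch of setting those addresses on the
Configuration programmatically. The host names and ports are placeholders
for your own cluster, and the keys shown are the Hadoop 1.x names (on 2.x
you would use fs.defaultFS and the YARN equivalents):

import org.apache.hadoop.conf.Configuration;

Configuration conf = new Configuration();
// Placeholder addresses -- substitute your cluster's NameNode and JobTracker
conf.set("fs.default.name", "hdfs://namenode-host:9000");
conf.set("mapred.job.tracker", "jobtracker-host:9001");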
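
And on your question about transferring the jar: besides having Some.jar on
the client classpath, you can point the job configuration at the jar file so
the framework ships it to the cluster at submit time. A minimal sketch,
assuming a local path of your choosing:

import org.apache.hadoop.mapred.JobConf;

JobConf jobConf = new JobConf(conf);
// Hypothetical local path; Hadoop copies this jar to the cluster with the job
jobConf.setJar("/path/to/Some.jar");

With the newer mapreduce API you can instead call
job.setJarByClass(ClassWithMain.class) and let Hadoop locate the jar that
contains that class.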

Regards
Ravi.


On Fri, Aug 23, 2013 at 9:37 AM, 정재부 <it...@samsung.com> wrote:

>  I commonly build an executable jar with a main method and run it from the
> command line with "hadoop jar Some.jar ClassWithMain input output".
>
> In that main method, the Job and Configuration are set up, and the
> Configuration class has setters to specify the mapper or reducer class,
> e.g. conf.setMapperClass(Mapper.class).
>
> However, when submitting a job remotely, I have to set the jar and the
> Mapper (and other) classes through the Hadoop client API.
>
> I want to programmatically transfer the jar from the client to a remote
> Hadoop cluster and execute it the way "hadoop jar" does, so that the main
> method can specify the mapper and reducer.
>
> So how can I deal with this problem?
>