Posted to hdfs-user@hadoop.apache.org by Krishna Rao <kr...@gmail.com> on 2013/01/04 15:40:46 UTC

Possible to run an application jar as a hadoop daemon?

Hi all,

I have a Java application jar that converts some files and writes directly
into HDFS.

If I want to run the jar I need to run it using "hadoop jar <application
jar>", so that it can access HDFS (that is, running "java -jar <application
jar>" results in an HDFS error).

Is it possible to run a jar as a Hadoop daemon?

Cheers,

Krishna

Re: Possible to run an application jar as a hadoop daemon?

Posted by Krishna Rao <kr...@gmail.com>.
Thanks for the replies.

Harsh J, "hadoop classpath" was exactly what I needed. Got it working now.

Cheers,

Krishna

On 6 January 2013 11:14, John Hancock <jh...@gmail.com> wrote:

> Krishna,
>
> You should be able to take the command you are using to start the hadoop
> job (hadoop jar ..) and paste it into a text file.  Then make the file
> executable and call it as a shell script in a CRON job (crontab -e).  To be
> safe, use absolute paths to reference any files in the command.
>
> Or, I suppose what you crazy kids and your object oriented programming
> would do is use Quartz.
>
>
> -John
>
> On Sat, Jan 5, 2013 at 4:33 PM, Chitresh Deshpande <
> chitreshdeshpande@gmail.com> wrote:
>
>> Hi Krishna,
>>
>> I don't know what you mean by a Hadoop daemon, but if you mean running it
>> when all the other Hadoop daemons (namenode, datanode, etc.) are started, then
>> you can modify the start-all.sh script in the bin (or sbin) directory.
>>
>> Thanks and Regards,
>> Chitresh Deshpande
>>
>>
>> On Fri, Jan 4, 2013 at 6:40 AM, Krishna Rao <kr...@gmail.com> wrote:
>>
>>> Hi all,
>>>
>>> I have a Java application jar that converts some files and writes
>>> directly into HDFS.
>>>
>>> If I want to run the jar I need to run it using "hadoop jar <application
>>> jar>", so that it can access HDFS (that is, running "java -jar <application
>>> jar>" results in an HDFS error).
>>>
>>> Is it possible to run a jar as a Hadoop daemon?
>>>
>>> Cheers,
>>>
>>> Krishna
>>>
>>
>>
>

Re: Possible to run an application jar as a hadoop daemon?

Posted by John Hancock <jh...@gmail.com>.
Krishna,

You should be able to take the command you are using to start the hadoop
job (hadoop jar ..) and paste it into a text file.  Then make the file
executable and call it as a shell script in a CRON job (crontab -e).  To be
safe, use absolute paths to reference any files in the command.
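
For illustration, a minimal sketch of such a wrapper script and crontab entry
(the jar path, log path and schedule below are hypothetical):

  #!/bin/sh
  # run-converter.sh -- wraps the "hadoop jar" command, using absolute paths
  /usr/bin/hadoop jar /opt/myapp/converter.jar >> /var/log/converter.log 2>&1

  # crontab -e entry: run the wrapper every day at 02:00
  0 2 * * * /opt/myapp/run-converter.sh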

Or, I suppose what you crazy kids and your object oriented programming
would do is use Quartz.


-John

On Sat, Jan 5, 2013 at 4:33 PM, Chitresh Deshpande <
chitreshdeshpande@gmail.com> wrote:

> Hi Krishna,
>
> I don't know what you mean by a Hadoop daemon, but if you mean running it
> when all the other Hadoop daemons (namenode, datanode, etc.) are started, then
> you can modify the start-all.sh script in the bin (or sbin) directory.
>
> Thanks and Regards,
> Chitresh Deshpande
>
>
> On Fri, Jan 4, 2013 at 6:40 AM, Krishna Rao <kr...@gmail.com> wrote:
>
>> Hi all,
>>
>> I have a Java application jar that converts some files and writes
>> directly into HDFS.
>>
>> If I want to run the jar I need to run it using "hadoop jar <application
>> jar>", so that it can access HDFS (that is, running "java -jar <application
>> jar>" results in an HDFS error).
>>
>> Is it possible to run a jar as a Hadoop daemon?
>>
>> Cheers,
>>
>> Krishna
>>
>
>

Re: Possible to run an application jar as a hadoop daemon?

Posted by Chitresh Deshpande <ch...@gmail.com>.
Hi Krishna,

I don't know what you mean by a Hadoop daemon, but if you mean running it when
all the other Hadoop daemons (namenode, datanode, etc.) are started, then you
can modify the start-all.sh script in the bin (or sbin) directory.
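
For illustration, a minimal sketch of what such an addition might look like,
appended at the end of start-all.sh (the jar path and log path are
hypothetical):

  # launch the converter application in the background once the daemons are up
  nohup "$HADOOP_HOME/bin/hadoop" jar /opt/myapp/converter.jar \
      >> /var/log/converter.log 2>&1 &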

Thanks and Regards,
Chitresh Deshpande


On Fri, Jan 4, 2013 at 6:40 AM, Krishna Rao <kr...@gmail.com> wrote:

> Hi all,
>
> I have a Java application jar that converts some files and writes directly
> into HDFS.
>
> If I want to run the jar I need to run it using "hadoop jar <application
> jar>", so that it can access HDFS (that is, running "java -jar <application
> jar>" results in an HDFS error).
>
> Is it possible to run a jar as a Hadoop daemon?
>
> Cheers,
>
> Krishna
>

Re: Possible to run an application jar as a hadoop daemon?

Posted by Robert Molina <rm...@hortonworks.com>.
Hi Krishna,
Do you simply want to schedule the job to run at specific times? If so, I
believe Oozie may be what you are looking for.

Regards,
Robert

On Fri, Jan 4, 2013 at 6:40 AM, Krishna Rao <kr...@gmail.com> wrote:

> Hi all,
>
> I have a Java application jar that converts some files and writes directly
> into HDFS.
>
> If I want to run the jar I need to run it using "hadoop jar <application
> jar>", so that it can access HDFS (that is, running "java -jar <application
> jar>" results in an HDFS error).
>
> Is it possible to run a jar as a Hadoop daemon?
>
> Cheers,
>
> Krishna
>

Re: Possible to run an application jar as a hadoop daemon?

Posted by Harsh J <ha...@cloudera.com>.
Hi,

On Fri, Jan 4, 2013 at 8:10 PM, Krishna Rao <kr...@gmail.com> wrote:
> If I want to run the jar I need to run it using "hadoop jar <application jar>", so that it can access HDFS (that is, running "java -jar <application jar>" results in an HDFS error).

The latter is because running a Hadoop program requires the Hadoop
dependencies and configs to be available on its runtime classpath.
This can be achieved either by building a fat jar (assembly) that contains
the config files and all required dependency jars, or by running it as
"java -cp yourjar.jar:`hadoop classpath` YourMainClass <args>", so that
the classpath is automatically set up for you.
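
For example, a minimal sketch (the jar name, main class and arguments are
hypothetical):

  # inspect what `hadoop classpath` expands to
  hadoop classpath

  # run the application with the Hadoop jars and config dirs on its classpath
  java -cp /opt/myapp/converter.jar:`hadoop classpath` \
      com.example.Converter /local/input /user/krishna/output

  # to leave it running in the background, daemon-style:
  nohup java -cp /opt/myapp/converter.jar:`hadoop classpath` \
      com.example.Converter /local/input /user/krishna/output \
      > /var/log/converter.log 2>&1 &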

--
Harsh J
