Posted to user@oozie.apache.org by Nitin Kumar <nk...@gmail.com> on 2016/02/08 05:11:24 UTC

Sending parameters to coordinators jobs at run-time

Hi!

I have some coordinator jobs (primarily Hive scripts) that run on a
daily basis. I need to pass parameters to the Hive script, such as 'day of
the year' and other variables that the script should read from a
configurable source.

I know this can be done by changing the workflow.xml and deploying it to
HDFS before the action is instantiated every day, but that is a lot of
overhead because we will have several such workflows to maintain.

Is there a better strategy to pass arguments to coordinator jobs at
run-time? Maybe a sequence of other actions that need to be performed in
the workflow so that the hive script gets the appropriate parameters?

Any suggestions are welcome.

Regards,
Nitin

Re: Sending parameters to coordinators jobs at run-time

Posted by Nitin Kumar <nk...@gmail.com>.
That's a good approach! I'd write Java property files and have my shell
node cat them.
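
A minimal sketch of that idea (script name, action name, and file name are
hypothetical): the script that the shell action runs echoes key=value lines
to stdout, so that with <capture-output/> enabled, Oozie makes them
available to later actions.

```shell
#!/bin/sh
# get_params.sh - hypothetical script for an Oozie shell action.
# With <capture-output/> on the action, each key=value line written
# to stdout becomes readable by later workflow actions through
# ${wf:actionData('get-params')['key']}.

# A run-time value computed inside the script.
echo "day_of_year=$(date +%j)"

# Additional configurable values read from a properties file
# (file name is hypothetical; it would ship alongside the workflow).
PROPS="job.properties"
if [ -f "$PROPS" ]; then
    cat "$PROPS"
fi
```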

Thanks Vincent!

On Tue, Feb 9, 2016 at 9:44 PM, Vincent Peplinski <vi...@gmail.com>
wrote:

> Have you considered running a script using the shell action to return the
> data you want?
> See http://oozie.apache.org/docs/4.0.0/DG_ShellActionExtension.html
>
> On Sun, Feb 7, 2016 at 11:11 PM, Nitin Kumar <nk...@gmail.com>
> wrote:

Re: Sending parameters to coordinators jobs at run-time

Posted by Vincent Peplinski <vi...@gmail.com>.
Have you considered running a script using the shell action to return the
data you want?
See http://oozie.apache.org/docs/4.0.0/DG_ShellActionExtension.html
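
For reference, the action might look roughly like this in workflow.xml (a
sketch, not a tested configuration; the script, action, and transition
names are made up):

```xml
<action name="get-params">
    <shell xmlns="uri:oozie:shell-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <exec>get_params.sh</exec>
        <file>${appPath}/get_params.sh#get_params.sh</file>
        <capture-output/>
    </shell>
    <ok to="hive-node"/>
    <error to="fail"/>
</action>

<!-- A later Hive action can then read a captured value, e.g.
     <param>day_of_year=${wf:actionData('get-params')['day_of_year']}</param> -->
```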

On Sun, Feb 7, 2016 at 11:11 PM, Nitin Kumar <nk...@gmail.com>
wrote:
