Posted to user@oozie.apache.org by Suresh S <sa...@gmail.com> on 2011/12/07 05:24:24 UTC

unable to run oozie workflow

Hi
I am able to run the example workflows in Oozie.
Now I want to run my own job in Oozie, which deploys war files using
map-reduce. It runs fine as a plain Hadoop task; it takes the HDFS path
of the war file and an output path as arguments.
I want to run this same job as an Oozie workflow.

So I used the same layout as the Oozie examples and kept the jar files in
the lib and examples-lib directories.
When I submit the job, the map task does not run at all: the Oozie job
stays in the RUNNING state, and I see no errors in the Oozie logs or the
Hadoop logs. Hadoop shows the map task status as pending only, and the
Oozie job keeps running indefinitely.

my job.properties file:
nameNode=hdfs://HNVETA23089.local:54310
jobTracker=HNVETA23089.local:54311
queueName=default
feedRoot=feedfactory
oozie.libpath=/user/${user.name}/${feedRoot}/apps/examples-lib
oozie.wf.application.path=${nameNode}/user/${user.name}/${feedRoot}/apps/launch
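
Property files like this are easy to get subtly wrong when pasted into mail (line wraps can introduce stray spaces inside ${...}), so it is worth confirming that the deployed job.properties resolves to the path you expect. A minimal sketch of that check in plain Python, nothing Oozie-specific: the property names come from the file above, the user.name value is assumed, and the substitution logic is only an approximation of how Oozie expands ${...}:

```python
import re

# job.properties as shown above (user.name hard-coded for illustration)
props = {
    "user.name": "suresh",  # assumed value, substitute your own
    "nameNode": "hdfs://HNVETA23089.local:54310",
    "feedRoot": "feedfactory",
    "oozie.wf.application.path": "${nameNode}/user/${user.name}/${feedRoot}/apps/launch",
}

def resolve(value, props):
    """Expand ${name} references using the property table."""
    return re.sub(r"\$\{([^}]+)\}", lambda m: props[m.group(1)], value)

app_path = resolve(props["oozie.wf.application.path"], props)
print(app_path)
# a stray space inside ${user.name } would raise KeyError here,
# and a space in the result means the wrap leaked into the path
```

If the printed path contains a space or the lookup raises KeyError, the wrapped line is the culprit.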

I tried two different workflows, but neither ran successfully:

workflow 1:

<workflow-app xmlns="uri:oozie:workflow:0.2" name="launch-wf">
    <start to="java-node"/>
    <action name="java-node">
        <java>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
            <main-class>com.veta.hadoop.feedfactory.JettyLauncher</main-class>
            <arg>/user/${wf:user()}</arg>
            <arg>output</arg>
        </java>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Java failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>

 -------------------------
workflow 2:

<workflow-app xmlns="uri:oozie:workflow:0.2" name="launch-wf">
    <start to="JT-node"/>
    <action name="JT-node">
        <map-reduce>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <prepare>
                <delete path="${nameNode}/user/${wf:user()}/${outputDir}"/>
            </prepare>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
                <property>
                    <name>mapred.mapper.new-api</name>
                    <value>true</value>
                </property>
                <property>
                    <name>mapred.reducer.new-api</name>
                    <value>true</value>
                </property>
                <property>
                    <name>mapred.input.format.class</name>
                    <value>com.veta.hadoop.feedfactory.io.WarFileFormat</value>
                </property>
                <property>
                    <name>mapred.output.key.class</name>
                    <value>org.apache.hadoop.io.Text</value>
                </property>
                <property>
                    <name>mapred.output.value.class</name>
                    <value>org.apache.hadoop.io.Text</value>
                </property>
                <property>
                    <name>mapred.mapper.class</name>
                    <value>com.veta.hadoop.feedfactory.engine.WebEngine</value>
                </property>
                <property>
                    <name>mapred.map.tasks</name>
                    <value>1</value>
                </property>
                <property>
                    <name>mapred.input.dir</name>
                    <value>/user/${wf:user()}/War</value>
                </property>
                <property>
                    <name>mapred.output.dir</name>
                    <value>/user/${wf:user()}/${outputDir}</value>
                </property>
            </configuration>
        </map-reduce>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Map/Reduce failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>


Thanks.
Suresh

Re: unable to run oozie workflow

Posted by Alejandro Abdelnur <tu...@cloudera.com>.
Suresh,

Does your cluster have enough task slots available?

Thanks.

Alejandro
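
For readers hitting the same symptom: each Oozie action is started by a launcher job that itself occupies a map slot, so on a very small cluster the launcher can hold the only free slot while the child job it submitted waits for one, leaving everything stuck in RUNNING/pending. One way to watch for this is to poll the job over Oozie's web services API. A rough sketch in Python: the host, port, and job id are placeholders, and only the "status" and "actions" fields of the info JSON are relied on:

```python
def summarize(info):
    """Pull the workflow status and per-action statuses out of the
    JSON returned by GET /oozie/v1/job/<job-id>?show=info."""
    actions = {a["name"]: a["status"] for a in info.get("actions", [])}
    return info["status"], actions

# Live call (placeholders -- adjust host, port, and job id):
#   import json
#   from urllib.request import urlopen
#   raw = urlopen("http://localhost:11000/oozie/v1/job/<job-id>?show=info").read()
#   print(summarize(json.loads(raw)))

# Offline illustration of the response shape (values made up):
sample = {"status": "RUNNING",
          "actions": [{"name": "java-node", "status": "PREP"}]}
print(summarize(sample))
# an action sitting in PREP/RUNNING while Hadoop shows no map progress
# is consistent with the launcher holding the slot its child needs
```

If the action never leaves PREP and the JobTracker shows the map task as pending, check how many map slots the cluster offers versus how many the launcher plus the actual job require.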

On Tue, Dec 6, 2011 at 8:24 PM, Suresh S <sa...@gmail.com> wrote: