Posted to user@oozie.apache.org by Gäde, Sebastian <s1...@hft-leipzig.de> on 2014/06/01 14:06:02 UTC

Oozie job failing due to Unknown hadoop job

Hi,

Although I've been successfully submitting jobs to Oozie, new jobs are not working anymore. I was playing with the configuration but have undone all changes I'm aware of. Whenever I submit a job now, the map-reduce action fails with
"Unknown hadoop job [job_local293236882_0001] associated with action [0000000-140601133941332-oozie-hdus-W@mr-node]."
I'm sure there should be no 'local' in the job ID; however, I don't know how to avoid having the job run locally.
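
I submit the job from the command line in the usual way (the server URL below is just a placeholder for my Oozie host):

oozie job -oozie http://localhost:11000/oozie -config job.properties -run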

Could this be caused by my setting for "oozie.service.HadoopAccessorService.hadoop.configurations"? It is set to "*=hadoop-conf", with hadoop-conf being a symlink to my actual Hadoop configuration directory.
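
For reference, the corresponding entry in my oozie-site.xml:

<property>
  <name>oozie.service.HadoopAccessorService.hadoop.configurations</name>
  <value>*=hadoop-conf</value>
</property>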

This is my map-reduce action configuration from the workflow.xml; NameNode and ResourceManager are running:
<map-reduce xmlns="uri:oozie:workflow:0.2">
  <job-tracker>hd-slave-172:8032</job-tracker>
  <name-node>hdfs://hd-slave-172:8020</name-node>
  <prepare>
    <delete path="/output" />
  </prepare>
  <configuration>
    <property>
      <name>mapreduce.map.memory.mb</name>
      <value>512</value>
    </property>
    <property>
      <name>mapreduce.reduce.memory.mb</name>
      <value>512</value>
    </property>
    <property>
      <name>mapreduce.job.queue.name</name>
      <value>default</value>
    </property>
    <property>
      <name>mapreduce.job.map.class</name>
      <value>de.sebastian.mr.CdrReaderAPIv2Mapper</value>
    </property>
    <property>
      <name>mapreduce.job.reduce.class</name>
      <value>de.sebastian.mr.CdrReaderAPIv2Reducer</value>
    </property>
    <property>
      <name>mapreduce.input.fileinputformat.inputdir</name>
      <value>/input</value>
    </property>
    <property>
      <name>mapreduce.output.fileoutputformat.outputdir</name>
      <value>/output</value>
    </property>
    <property>
      <name>mapreduce.job.inputformat.class</name>
      <value>org.apache.hadoop.mapreduce.lib.input.TextInputFormat</value>
    </property>
    <property>
      <name>mapreduce.map.output.key.class</name>
      <value>org.apache.hadoop.io.Text</value>
    </property>
    <property>
      <name>mapreduce.map.output.value.class</name>
      <value>de.sebastian.mr.CdrCounter</value>
    </property>
    <property>
      <name>mapreduce.job.output.key.class</name>
      <value>org.apache.hadoop.io.Text</value>
    </property>
    <property>
      <name>mapreduce.job.output.value.class</name>
      <value>de.sebastian.mr.CdrCounter</value>
    </property>
    <property>
      <name>mapreduce.job.combine.class</name>
      <value>de.sebastian.mr.CdrReaderAPIv2Reducer</value>
    </property>
    <property>
      <name>mapred.reducer.new-api</name>
      <value>true</value>
    </property>
    <property>
      <name>mapred.mapper.new-api</name>
      <value>true</value>
    </property>
  </configuration>
  <file>listen/bdlliste-ng.csv</file>
  <file>listen/dialed_digits.csv</file>
</map-reduce>
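
For completeness, the job.properties I submit with looks roughly like this (the application path is only an example, not my real one):

nameNode=hdfs://hd-slave-172:8020
jobTracker=hd-slave-172:8032
queueName=default
# example path; my actual workflow directory differs
oozie.wf.application.path=${nameNode}/user/seb/apps/mr-node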

Thanks for your help!
Cheers
Seb.