Posted to user@oozie.apache.org by Neosix <ne...@gmail.com> on 2015/04/01 09:58:37 UTC

Fwd: Error on Pig script

Hi,

I use OpenStack (a default RDO installation) with Swift and Sahara.
Every time I run a job, even a test job, I get a JA018 error with the error
message: "Main class [org.apache.oozie.action.hadoop.PigMain], exit code
[2]".

I set the log level to ALL, but I could not find any clear error string.
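For reference, the launcher task's own stdout/stderr (where the real Pig error usually ends up with exit code 2, rather than the Oozie server log) can be pulled like this; the job ID and application ID below are placeholders, not real values from this cluster:

```shell
# Oozie CLI: show the workflow job's log (placeholder job ID)
oozie job -oozie http://cluster-2-4-1-master-node-2-4-1-001:11000/oozie \
    -log 0000001-150401000000000-oozie-hado-W

# YARN: dump the launcher container's stdout/stderr (placeholder app ID)
yarn logs -applicationId application_1427900000000_0001
```

These commands require a live cluster, so they are a sketch of where to look rather than something runnable as-is.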

I tried setting the parameter
"oozie.libpath=hdfs://cluster-2-4-1-master-node-2-4-1-001:9000/user/hadoop/share/lib"
to point at the correct sharelib path, but I received the same error both with
and without it.
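As an aside on where that setting normally lives: oozie.libpath is read from the job properties submitted to Oozie, not from a Pig action `<param>`. A minimal job.properties sketch, assuming the host name from this cluster (the application path is shortened to a placeholder):

```properties
nameNode=hdfs://cluster-2-4-1-master-node-2-4-1-001:9000
jobTracker=cluster-2-4-1-master-node-2-4-1-001:8032
# either point Oozie at the sharelib explicitly...
oozie.libpath=${nameNode}/user/hadoop/share/lib
# ...or let Oozie resolve the installed system sharelib
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/user/hadoop/test-job/workflow.xml
```

Only one of oozie.libpath and oozie.use.system.libpath is normally needed.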

I can see that the map task completed, but the reduce task never
started.

The Oozie version is 4.0.1, and the cluster was provisioned by Sahara
with Ubuntu and CentOS 6.5 vanilla images.

How can I resolve this?

The workflow follows:

<?xml version="1.0" ?>
<workflow-app name="job-wf" xmlns="uri:oozie:workflow:0.2">
  <start to="job-node"/>
  <action name="job-node">
    <pig>
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <configuration>
        <property>
          <name>fs.swift.service.sahara.password</name>
          <value>hi</value>
        </property>
        <property>
          <name>fs.swift.service.sahara.username</name>
          <value>hi</value>
        </property>
      </configuration>
      <script>test</script>
      <param>INPUT=swift://test.sahara/input_test_sahara.tar</param>
      <param>OUTPUT=swift://test.sahara/output</param>
      <param>oozie.libpath=${nameNode}/user/hadoop/share/lib</param>
    </pig>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Workflow failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name="end"/>
</workflow-app>

And the job configuration:

<configuration>
  <property>
    <name>user.name</name>
    <value>hadoop</value>
  </property>
  <property>
    <name>oozie.use.system.libpath</name>
    <value>true</value>
  </property>
  <property>
    <name>mapreduce.job.user.name</name>
    <value>hadoop</value>
  </property>
  <property>
    <name>nameNode</name>
    <value>hdfs://cluster-2-4-1-master-node-2-4-1-001:9000</value>
  </property>
  <property>
    <name>jobTracker</name>
    <value>http://203.0.113.127:8032</value>
  </property>
  <property>
    <name>oozie.wf.application.path</name>

<value>hdfs://cluster-2-4-1-master-node-2-4-1-001:9000/user/hadoop/test-job/06e4b4fa-94d8-43b8-b660-7f06feaf17d9/workflow.xml</value>
  </property>
</configuration>


Thanx

-- 
Neosix


