Posted to dev@pig.apache.org by "liyunzhang_intel (JIRA)" <ji...@apache.org> on 2016/10/26 07:56:58 UTC

[jira] [Created] (PIG-5051) Initialize PigConstants.TASK_INDEX in spark mode correctly

liyunzhang_intel created PIG-5051:
-------------------------------------

             Summary: Initialize PigConstants.TASK_INDEX in spark mode correctly
                 Key: PIG-5051
                 URL: https://issues.apache.org/jira/browse/PIG-5051
             Project: Pig
          Issue Type: Sub-task
            Reporter: liyunzhang_intel


In MR mode, we initialize PigConstants.TASK_INDEX in org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapReduce.Reduce#setup:
{code}
protected void setup(Context context) throws IOException, InterruptedException {
    ...
    // Record this task's index in the job configuration so it is
    // available (e.g. to UDFs) while the task runs.
    context.getConfiguration().set(PigConstants.TASK_INDEX,
        Integer.toString(context.getTaskAttemptID().getTaskID().getId()));
    ...
}
{code}
But Spark does not provide a per-task setup hook like PigGenericMapReduce.Reduce#setup that runs when a task starts, so PigConstants.TASK_INDEX is never initialized in Spark mode. We need to find a way to initialize PigConstants.TASK_INDEX correctly.
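One possible direction (a sketch, not the committed fix): in Spark, the partition id is available inside a task via org.apache.spark.TaskContext.get().partitionId(), so a converter could copy the job configuration per partition and set TASK_INDEX there before processing records. The snippet below simulates that pattern with a plain Map standing in for the Hadoop Configuration, so it runs without a Spark dependency; the class and method names are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of per-task TASK_INDEX initialization.
// In real Spark code, partitionId would come from
// org.apache.spark.TaskContext.get().partitionId() at the start of a
// mapPartitions closure; here it is passed in directly.
public class TaskIndexSketch {

    // Stands in for PigConstants.TASK_INDEX.
    static final String TASK_INDEX = "pig.task.index";

    // What a per-partition "setup" would do: clone the shared job
    // configuration and record this task's index before the first
    // record is processed, so each task sees its own value.
    static Map<String, String> setupTaskConf(Map<String, String> jobConf,
                                             int partitionId) {
        Map<String, String> taskConf = new HashMap<>(jobConf);
        taskConf.put(TASK_INDEX, Integer.toString(partitionId));
        return taskConf;
    }

    public static void main(String[] args) {
        Map<String, String> jobConf = new HashMap<>();
        jobConf.put("pig.job.name", "demo");

        Map<String, String> conf0 = setupTaskConf(jobConf, 0);
        Map<String, String> conf3 = setupTaskConf(jobConf, 3);

        System.out.println(conf0.get(TASK_INDEX)); // prints 0
        System.out.println(conf3.get(TASK_INDEX)); // prints 3
    }
}
```

Copying the configuration per task matters because a JVM-wide value would be clobbered when Spark runs several partitions concurrently in one executor.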



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)