Posted to dev@hive.apache.org by "Xuefu Zhang (JIRA)" <ji...@apache.org> on 2017/06/01 03:17:04 UTC

[jira] [Created] (HIVE-16799) Control the max number of task for a stage in a spark job

Xuefu Zhang created HIVE-16799:
----------------------------------

             Summary: Control the max number of task for a stage in a spark job
                 Key: HIVE-16799
                 URL: https://issues.apache.org/jira/browse/HIVE-16799
             Project: Hive
          Issue Type: Improvement
            Reporter: Xuefu Zhang
            Assignee: Xuefu Zhang


HIVE-16552 gives admins an option to control the maximum number of tasks a Spark job may have. However, this may not be sufficient, as it tends to penalize jobs that have many stages while favoring jobs that have fewer stages. Ideally, we should also limit the number of tasks in a stage, which is closer to limiting the maximum number of mappers or reducers in an MR job.
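The difference between the two caps can be sketched as follows. This is a hypothetical illustration, not Hive's actual API: the method names, the list-of-stage-widths representation, and the limit values are all invented for the example.

```java
import java.util.List;

// Hypothetical sketch: contrasts a per-job task cap (as in HIVE-16552)
// with the per-stage cap proposed here. Not Hive's actual implementation.
public class TaskLimitSketch {

    // Per-job check: total task count summed across all stages.
    static boolean exceedsJobLimit(List<Integer> tasksPerStage, int maxJobTasks) {
        int total = tasksPerStage.stream().mapToInt(Integer::intValue).sum();
        return total > maxJobTasks;
    }

    // Proposed per-stage check: does any single stage exceed the cap?
    // A stage's width is what actually bounds simultaneous mappers/reducers.
    static boolean exceedsStageLimit(List<Integer> tasksPerStage, int maxStageTasks) {
        return tasksPerStage.stream().anyMatch(t -> t > maxStageTasks);
    }

    public static void main(String[] args) {
        // A 5-stage job with 300 tasks per stage (1500 total) trips a
        // 1000-task job cap, while a single-stage job with 900 tasks does
        // not -- even though the latter runs more tasks at once.
        List<Integer> manyStages = List.of(300, 300, 300, 300, 300);
        List<Integer> oneBigStage = List.of(900);

        System.out.println(exceedsJobLimit(manyStages, 1000));   // true
        System.out.println(exceedsJobLimit(oneBigStage, 1000));  // false

        // A per-stage cap of 500 judges both jobs by their widest stage,
        // so the multi-stage job passes and the wide single stage fails.
        System.out.println(exceedsStageLimit(manyStages, 500));  // false
        System.out.println(exceedsStageLimit(oneBigStage, 500)); // true
    }
}
```

Under this sketch, the per-stage cap treats a job with many narrow stages more fairly than the per-job cap does, which is the asymmetry the ticket describes.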



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)