Posted to issues@spark.apache.org by "Hong Shen (JIRA)" <ji...@apache.org> on 2014/12/16 10:34:13 UTC

[jira] [Updated] (SPARK-4341) Spark needs to set num-executors automatically

     [ https://issues.apache.org/jira/browse/SPARK-4341?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hong Shen updated SPARK-4341:
-----------------------------
    Attachment: SPARK-4341.diff

> Spark needs to set num-executors automatically
> ----------------------------------------------
>
>                 Key: SPARK-4341
>                 URL: https://issues.apache.org/jira/browse/SPARK-4341
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.1.0
>            Reporter: Hong Shen
>         Attachments: SPARK-4341.diff
>
>
> A MapReduce job can set the number of map tasks automatically, but in Spark we have to set num-executors, executor memory, and cores. It's difficult for users to set these arguments, especially for users who want to use Spark SQL. So when the user hasn't set num-executors, Spark should set it automatically according to the input partitions.
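
The attached SPARK-4341.diff is not included in this message, so the following is only a minimal Scala sketch of the idea described above: fall back to a partitions-based default when the user has not set the executor count. The spark.executor.instances key reflects the standard Spark property; the PartitionsPerExecutor heuristic, the min/max bounds, and the resolveNumExecutors helper are assumptions for illustration, not the actual patch.

    import org.apache.spark.SparkConf

    object ExecutorDefaults {
      // Hypothetical heuristic: aim for roughly this many input partitions
      // per executor, bounded so tiny or huge inputs stay reasonable.
      private val PartitionsPerExecutor = 4
      private val MinExecutors = 2
      private val MaxExecutors = 500

      def resolveNumExecutors(conf: SparkConf, inputPartitions: Int): Int = {
        // Respect an explicit user setting; only fall back to the heuristic
        // when spark.executor.instances is absent.
        conf.getOption("spark.executor.instances") match {
          case Some(explicit) => explicit.toInt
          case None =>
            val derived =
              math.ceil(inputPartitions.toDouble / PartitionsPerExecutor).toInt
            math.min(MaxExecutors, math.max(MinExecutors, derived))
        }
      }
    }

With these illustrative defaults, an input of 1,000 partitions would yield 250 executors, while a 3-partition input would still get the 2-executor floor; an explicit --num-executors (spark.executor.instances) setting always wins.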



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org