Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/04/30 05:32:13 UTC

[jira] [Commented] (SPARK-1099) Allow inferring number of cores with local[*]

    [ https://issues.apache.org/jira/browse/SPARK-1099?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15265117#comment-15265117 ] 

Apache Spark commented on SPARK-1099:
-------------------------------------

User 'qqsun8819' has created a pull request for this issue:
https://github.com/apache/spark/pull/110

> Allow inferring number of cores with local[*]
> ---------------------------------------------
>
>                 Key: SPARK-1099
>                 URL: https://issues.apache.org/jira/browse/SPARK-1099
>             Project: Spark
>          Issue Type: Improvement
>          Components: Deploy
>            Reporter: Aaron Davidson
>            Assignee: Aaron Davidson
>            Priority: Minor
>             Fix For: 1.0.0
>
>
> It seems reasonable that the default number of cores used by Spark's local mode (when no value is specified) should be drawn from the spark.cores.max configuration parameter (which, conveniently, is now settable as a command-line option in spark-shell).
> For the sake of consistency, this change would likely also entail making the default number of cores, when spark.cores.max is NOT specified, equal to the number of logical cores on the machine (which is what standalone mode does). This too seems reasonable: Spark is inherently a distributed system, and it's expected to use multiple cores by default. However, it is a behavioral change, and thus requires caution.
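The inference described above can be sketched roughly as follows. This is a hypothetical standalone helper for illustration only, not Spark's actual implementation (which parses the master string inside SparkContext in Scala); the function name and the exact regex are assumptions.

```python
import os
import re

def infer_local_cores(master: str) -> int:
    """Infer the worker-thread count from a Spark-style local master string.

    Hypothetical helper mirroring the behavior proposed in SPARK-1099:
    "local[*]" uses every logical core on the machine, "local[N]" uses N,
    and bare "local" uses a single thread.
    """
    if master == "local":
        return 1
    m = re.fullmatch(r"local\[(\*|\d+)\]", master)
    if m is None:
        raise ValueError(f"not a local master string: {master!r}")
    if m.group(1) == "*":
        # Use all logical cores, matching standalone mode's default.
        return os.cpu_count()
    return int(m.group(1))
```

For example, infer_local_cores("local[4]") yields 4, while infer_local_cores("local[*]") yields the machine's logical core count.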



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org