Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:00:36 UTC

[jira] [Updated] (SPARK-21792) Document Spark Streaming Dynamic Allocation Configuration

     [ https://issues.apache.org/jira/browse/SPARK-21792?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-21792:
---------------------------------
    Labels: bulk-closed  (was: )

> Document Spark Streaming Dynamic Allocation Configuration
> ---------------------------------------------------------
>
>                 Key: SPARK-21792
>                 URL: https://issues.apache.org/jira/browse/SPARK-21792
>             Project: Spark
>          Issue Type: Documentation
>          Components: Documentation, DStreams
>    Affects Versions: 2.2.0
>            Reporter: Karthik Palaniappan
>            Priority: Minor
>              Labels: bulk-closed
>
> Support for dynamic allocation was added in https://issues.apache.org/jira/browse/SPARK-12133, and it included new config parameters in https://github.com/apache/spark/blob/branch-2.0/streaming/src/main/scala/org/apache/spark/streaming/scheduler/ExecutorAllocationManager.scala#L189.
> These new parameters should be added to http://spark.apache.org/docs/latest/configuration.html.
> Out of curiosity: why can't Core's dynamic allocation and Streaming's dynamic allocation both be enabled? It seems like Streaming's dynamic allocation properties should always be used for streaming applications, and Core's should always be ignored. (Either way, this should be highlighted in the documentation.)
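For context, the undocumented parameters referenced above live under the `spark.streaming.dynamicAllocation.*` prefix in the linked ExecutorAllocationManager.scala. A minimal sketch of enabling them at submit time follows; the default values shown are assumptions based on a reading of that file on branch-2.0, and the application jar name is a placeholder. Note that the validation in that file appears to require Core's dynamic allocation (`spark.dynamicAllocation.enabled`) to be off, which relates to the question above.

```shell
# Sketch: enabling Spark Streaming's dynamic allocation (not Core's).
# Config keys come from streaming/.../ExecutorAllocationManager.scala;
# the specific values here are illustrative, not recommended defaults.
spark-submit \
  --conf spark.dynamicAllocation.enabled=false \
  --conf spark.streaming.dynamicAllocation.enabled=true \
  --conf spark.streaming.dynamicAllocation.scalingInterval=60 \
  --conf spark.streaming.dynamicAllocation.scalingUpRatio=0.9 \
  --conf spark.streaming.dynamicAllocation.scalingDownRatio=0.3 \
  --conf spark.streaming.dynamicAllocation.minExecutors=2 \
  --conf spark.streaming.dynamicAllocation.maxExecutors=10 \
  my-streaming-app.jar   # placeholder application jar
```

The scaling decision compares the ratio of batch processing time to batch interval against the up/down ratios on each `scalingInterval` tick, requesting or releasing executors within the min/max bounds.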



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org