Posted to issues@spark.apache.org by "Thomas Graves (JIRA)" <ji...@apache.org> on 2016/02/16 20:20:18 UTC
[jira] [Resolved] (SPARK-11701) YARN - dynamic allocation and speculation active task accounting wrong
[ https://issues.apache.org/jira/browse/SPARK-11701?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Thomas Graves resolved SPARK-11701.
-----------------------------------
Resolution: Duplicate
> YARN - dynamic allocation and speculation active task accounting wrong
> ----------------------------------------------------------------------
>
> Key: SPARK-11701
> URL: https://issues.apache.org/jira/browse/SPARK-11701
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.5.1
> Reporter: Thomas Graves
> Assignee: Thomas Graves
> Priority: Critical
>
> I am using dynamic container allocation together with speculation and am seeing issues with the active task accounting. The Executor UI still shows active tasks on an executor even though the job/stage has completed. I think it is also preventing dynamic allocation from releasing containers, because it believes there are still running tasks.
> It is easy to reproduce with spark-shell: turn on dynamic allocation, set the speculation parameters low, then run a simple wordcount on a decent-sized file and save the result back to HDFS:
> spark.dynamicAllocation.enabled true
> spark.shuffle.service.enabled true
> spark.dynamicAllocation.maxExecutors 10
> spark.dynamicAllocation.minExecutors 2
> spark.dynamicAllocation.initialExecutors 10
> spark.dynamicAllocation.executorIdleTimeout 40s
> $SPARK_HOME/bin/spark-shell --conf spark.speculation=true --conf spark.speculation.multiplier=0.2 --conf spark.speculation.quantile=0.1 --master yarn --deploy-mode client --executor-memory 4g --driver-memory 4g
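[Editor's note: a minimal sketch of the kind of wordcount-and-save job the report describes, as it might be entered in the spark-shell session above. The input and output paths are placeholders, not taken from the original report.]

    // "sc" is the SparkContext provided by spark-shell.
    // Hypothetical paths: a decent-sized input file and an HDFS output directory.
    val counts = sc.textFile("hdfs:///user/example/input/big-file.txt")
      .flatMap(_.split("\\s+"))   // split each line into words
      .map(word => (word, 1))     // pair each word with a count of 1
      .reduceByKey(_ + _)         // sum counts per word

    // Saving back to HDFS runs the stage whose speculative tasks were reported
    // to leave stale "active task" counts in the Executor UI.
    counts.saveAsTextFile("hdfs:///user/example/output/wordcount")

With the low speculation multiplier and quantile set above, speculative copies of tasks launch early, which is the condition under which the stale accounting was observed.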
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org