Posted to issues@spark.apache.org by "Egor Pahomov (JIRA)" <ji...@apache.org> on 2014/11/14 16:04:34 UTC
[jira] [Created] (SPARK-4403) Elastic allocation (spark.dynamicAllocation.enabled) results in task never being executed.
Egor Pahomov created SPARK-4403:
-----------------------------------
Summary: Elastic allocation (spark.dynamicAllocation.enabled) results in task never being executed.
Key: SPARK-4403
URL: https://issues.apache.org/jira/browse/SPARK-4403
Project: Spark
Issue Type: Bug
Components: Spark Core, YARN
Affects Versions: 1.1.1
Reporter: Egor Pahomov
I run an IPython notebook with PySpark and spark.dynamicAllocation.enabled = true. The task never finishes.
Code:
{code}
import sys
from random import random
from operator import add
partitions = 10
n = 100000 * partitions
def f(_):
    x = random() * 2 - 1
    y = random() * 2 - 1
    return 1 if x ** 2 + y ** 2 < 1 else 0
count = sc.parallelize(xrange(1, n + 1), partitions).map(f).reduce(add)
print "Pi is roughly %f" % (4.0 * count / n)
{code}
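For reference, the same Monte Carlo estimate runs fine without Spark, which suggests the computation itself is not the problem. A minimal standalone sketch (Python 3, plain {{random}}, no Spark; the function name is illustrative):

{code}
import random

def estimate_pi(n, seed=42):
    """Estimate pi as 4 * (fraction of random points landing in the unit circle)."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n):
        x = rng.random() * 2 - 1
        y = rng.random() * 2 - 1
        if x ** 2 + y ** 2 < 1:
            inside += 1
    return 4.0 * inside / n

print("Pi is roughly %f" % estimate_pi(1000000))
{code}

With 10^6 samples the estimate is typically within ~0.005 of pi, so a hang in the Spark job points at scheduling or executor allocation rather than the user code.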
{code}
pyspark \
--verbose \
--master yarn-client \
--conf spark.driver.port=$((RANDOM_PORT + 2)) \
--conf spark.broadcast.port=$((RANDOM_PORT + 3)) \
--conf spark.replClassServer.port=$((RANDOM_PORT + 4)) \
--conf spark.blockManager.port=$((RANDOM_PORT + 5)) \
--conf spark.executor.port=$((RANDOM_PORT + 6)) \
--conf spark.fileserver.port=$((RANDOM_PORT + 7)) \
--conf spark.shuffle.service.enabled=true \
--conf spark.dynamicAllocation.enabled=true \
--conf spark.dynamicAllocation.minExecutors=1 \
--conf spark.dynamicAllocation.maxExecutors=10 \
--conf spark.ui.port=$SPARK_UI_PORT
{code}
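One possible cause worth checking: dynamic allocation on YARN requires the external shuffle service (spark_shuffle) to be registered as an auxiliary service on every NodeManager; setting spark.shuffle.service.enabled=true on the client side alone is not sufficient, and without the NodeManager side tasks can stall waiting for executors. A sketch of the yarn-site.xml entries this would need (property names per the Spark-on-YARN shuffle service setup; verify against your cluster's config):

{code}
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>mapreduce_shuffle,spark_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>
{code}

The spark-<version>-yarn-shuffle.jar must also be on the NodeManager classpath, and NodeManagers restarted after the change.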
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)