Posted to issues@spark.apache.org by "Ofer Eliassaf (JIRA)" <ji...@apache.org> on 2016/09/08 07:13:21 UTC

[jira] [Comment Edited] (SPARK-17444) spark memory allocation makes workers non responsive

    [ https://issues.apache.org/jira/browse/SPARK-17444?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15472931#comment-15472931 ] 

Ofer Eliassaf edited comment on SPARK-17444 at 9/8/16 7:13 AM:
---------------------------------------------------------------

Sorry - I can't reproduce this properly.
Closing.

The problem was that I used different spark-env.sh versions on different machines, which led to undefined behavior.
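For context, a minimal sketch of the kind of spark-env.sh settings that need to be identical across the worker machines; the values are illustrative only (based on the 4-core / 4.5GB figures in the report below), not the actual files involved:

    # conf/spark-env.sh - keep this file identical on every worker machine
    SPARK_WORKER_CORES=4       # cores each worker offers to executors
    SPARK_WORKER_MEMORY=4500m  # memory each worker allows executors to use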


was (Author: ofer):
Sorry - I can't reproduce this properly.
Closing.

> spark memory allocation makes workers non responsive
> ----------------------------------------------------
>
>                 Key: SPARK-17444
>                 URL: https://issues.apache.org/jira/browse/SPARK-17444
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.0.0
>         Environment: spark standalone
>            Reporter: Ofer Eliassaf
>            Priority: Critical
>
> I am running a cluster of 3 slaves and 2 masters with Spark standalone.
> There are 12 cores in total (4 on each machine).
> The memory allocated to executors and workers is 4.5GB, and each machine has 8GB in total.
> Steps to reproduce (see the sketch below the quoted description for a runnable version):
> Open pyspark and point it to the masters.
> Run the following command multiple times:
> sc.parallelize(range(1,50000000), 12).count()
> After a few runs the Python shell stops responding.
> Then exit the Python shell.
> The critical issue is that after this happens the cluster is no longer usable:
> there is no way to submit an application or run other commands on the cluster.
> Hope this helps!
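
A minimal, runnable sketch of the reproduction steps quoted above (the master URL, app name, and loop count are placeholders, not taken from the original cluster):

    # Reproduction sketch for SPARK-17444 - master URL and app name are hypothetical.
    from pyspark import SparkConf, SparkContext

    conf = (SparkConf()
            .setMaster("spark://master-host:7077")   # standalone master (placeholder)
            .setAppName("spark-17444-repro")
            .set("spark.executor.memory", "4500m"))  # matches the 4.5GB figure above
    sc = SparkContext(conf=conf)

    # Repeatedly count a large parallelized range across 12 partitions, as described;
    # the hang was reported to appear after a few such runs.
    for i in range(10):
        print(sc.parallelize(range(1, 50000000), 12).count())

    sc.stop()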



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org