Posted to issues@spark.apache.org by "Michael Crawford (JIRA)" <ji...@apache.org> on 2017/09/30 11:51:01 UTC

[jira] [Commented] (SPARK-5243) Spark will hang if (driver memory + executor memory) exceeds limit on a 1-worker cluster

    [ https://issues.apache.org/jira/browse/SPARK-5243?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16187051#comment-16187051 ] 

Michael Crawford commented on SPARK-5243:
-----------------------------------------

I spent an entire night stuck on this bug.

Starting a simple EMR cluster with 1 master and 1 worker node and then submitting with deploy-mode cluster will trigger this problem.
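
For reference, a submit along these lines reproduces it (the memory values and my_job.py are illustrative placeholders, not my exact command):

    spark-submit \
      --deploy-mode cluster \
      --driver-memory 2g \
      --executor-memory 4g \
      my_job.py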

If you are just testing and trying to reduce costs, you can start up with only a master node and no workers (i.e. an instance count of 1), and there is no issue.
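
A minimal launch command in that spirit (the release label and instance type here are just examples, adjust to taste):

    aws emr create-cluster \
      --name spark-test \
      --release-label emr-5.8.0 \
      --applications Name=Spark \
      --instance-type m4.large \
      --instance-count 1 \
      --use-default-roles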



> Spark will hang if (driver memory + executor memory) exceeds limit on a 1-worker cluster
> ----------------------------------------------------------------------------------------
>
>                 Key: SPARK-5243
>                 URL: https://issues.apache.org/jira/browse/SPARK-5243
>             Project: Spark
>          Issue Type: Improvement
>          Components: Deploy
>    Affects Versions: 1.2.0
>         Environment: centos, others should be similar
>            Reporter: yuhao yang
>            Priority: Minor
>
> Spark will hang when spark-submit is called under the following conditions:
> 1. the cluster has only one worker.
> 2. driver memory + executor memory > worker memory.
> 3. deploy-mode = cluster.
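> As a concrete illustration (the numbers and jar name are made up): on a standalone worker registered with 8g, submitting
>     spark-submit --master spark://master:7077 --deploy-mode cluster \
>       --class com.example.Main --driver-memory 4g --executor-memory 6g myApp.jar
> launches the 4g driver on the worker (it fits on its own), leaving only 4g free; the 6g executor can then never be allocated, and the application waits forever instead of failing fast.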
> This usually happens to beginners during development.
> There should be some exit mechanism, or at least a warning message, in the output of spark-submit.
> I would like to hear your opinions on whether a fix is needed (or is this by design?) and on the best way to fix it.



