Posted to issues@spark.apache.org by "siva venkat gogineni (JIRA)" <ji...@apache.org> on 2014/06/16 15:26:02 UTC

[jira] [Created] (SPARK-2154) Worker goes down.

siva venkat gogineni created SPARK-2154:
-------------------------------------------

             Summary: Worker goes down.
                 Key: SPARK-2154
                 URL: https://issues.apache.org/jira/browse/SPARK-2154
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 1.0.0, 0.9.0, 0.8.1
         Environment: Spark on cluster of three nodes on Ubuntu 12.04.4 LTS
            Reporter: siva venkat gogineni


The worker dies when I try to submit more drivers than there are available cores. When I submit 9 drivers, each requesting one core, on a cluster with 8 cores in total, the worker dies as soon as I submit the 9th driver. It works fine until all 8 cores are taken; as soon as I submit the 9th driver, its status remains "Submitted" and the worker crashes. I understand that we cannot run more drivers than the available cores, but the problem is that instead of the 9th driver waiting in the queue, it is executed, and as a result it crashes the worker. Let me know if there is a way to work around this issue, or whether it is being fixed in an upcoming version.

Cluster Details:
Spark 1.0.0
2 nodes with 4 cores each.
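
To make the scenario concrete, here is a minimal sketch of the kind of trivial driver that reproduces this. The object name (SleepDriver) and the sleep duration are hypothetical, chosen only so that each submitted driver holds its single core long enough for 9 concurrent submissions to exceed the cluster's 8 cores:

object SleepDriver {
  def main(args: Array[String]): Unit = {
    // Hold this driver's core for a while so that concurrent
    // submissions can exhaust the cluster's 8 cores.
    Thread.sleep(10 * 60 * 1000L) // 10 minutes, arbitrary
  }
}

Submitting this 9 times in cluster mode, e.g. with
spark-submit --master spark://<master>:7077 --deploy-mode cluster --class SleepDriver sleep-driver.jar
(jar name is illustrative), should show the first 8 drivers running and the 9th stuck in "Submitted" just before the worker goes down.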



--
This message was sent by Atlassian JIRA
(v6.2#6252)