Posted to issues@spark.apache.org by "Neerja Khattar (JIRA)" <ji...@apache.org> on 2016/12/06 21:50:58 UTC

[jira] [Created] (SPARK-18750) Spark should be able to control the number of executors and should not throw a stack overflow

Neerja Khattar created SPARK-18750:
--------------------------------------

             Summary: Spark should be able to control the number of executors and should not throw a stack overflow
                 Key: SPARK-18750
                 URL: https://issues.apache.org/jira/browse/SPARK-18750
             Project: Spark
          Issue Type: Bug
            Reporter: Neerja Khattar


16/11/29 15:47:47 INFO impl.ContainerManagementProtocolProxy: Opening proxy : bdtcstr61n5.svr.us.jpmchase.net:8041 
16/11/29 15:47:47 INFO impl.ContainerManagementProtocolProxy: Opening proxy : bdtcstr61n8.svr.us.jpmchase.net:8041 
16/11/29 15:47:47 INFO impl.ContainerManagementProtocolProxy: Opening proxy : bdtcstr61n2.svr.us.jpmchase.net:8041 
16/11/29 15:47:47 INFO yarn.YarnAllocator: Driver requested a total number of 32770 executor(s). 
16/11/29 15:47:47 INFO yarn.YarnAllocator: Will request 24576 executor containers, each with 1 cores and 6758 MB memory including 614 MB overhead 
16/11/29 15:49:11 INFO yarn.YarnAllocator: Driver requested a total number of 52902 executor(s). 
16/11/29 15:49:11 WARN yarn.ApplicationMaster: Reporter thread fails 1 time(s) in a row. 
java.lang.StackOverflowError 
at scala.collection.immutable.HashMap.$plus(HashMap.scala:57) 
at scala.collection.immutable.HashMap.$plus(HashMap.scala:36) 
at scala.collection.mutable.MapBuilder.$plus$eq(MapBuilder.scala:28) 

As the log above shows, YARN is asked for 24576 executor containers even though only 1719 cores are available in the cluster, and the driver goes on to request a total of 52902 executors, which is far too high.

This exception should be fixed: the YARN allocator should cap executor requests at what the cluster can actually supply instead of letting the reporter thread fail with a StackOverflowError.
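
Until the allocator enforces such a cap, dynamic allocation can be bounded explicitly. A minimal sketch in spark-defaults.conf form, assuming dynamic allocation is in use; the property names are standard Spark configuration keys, and the values are illustrative (maxExecutors is set to the 1719 available cores noted above, and 6g of executor memory plus the default ~10% overhead matches the 6758 MB shown in the log):

    spark.dynamicAllocation.enabled        true
    spark.shuffle.service.enabled          true
    spark.dynamicAllocation.minExecutors   2
    spark.dynamicAllocation.maxExecutors   1719
    spark.executor.cores                   1
    spark.executor.memory                  6g

Note that spark.dynamicAllocation.maxExecutors defaults to unbounded (Int.MaxValue), which is how the driver can end up requesting 52902 executors in the first place.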




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org