Posted to user@spark.apache.org by Aravindh <ma...@aravindh.io> on 2016/11/08 13:59:48 UTC

Spark streaming uses fewer executors than expected

Hi, I am using Spark Streaming to process some events. It is deployed in
standalone mode with 1 master and 3 workers. I have set the number of cores per
executor to 4 and the total number of cores to 24, which means 6 executors
will be spawned. I have set spread-out to true, so each worker machine gets 2
executors. My batch interval is 1 second. While running, what I observe from
the event timeline is that only 3 of the executors are being used; the other 3
sit idle. As far as I know, there is no parameter in Spark standalone mode to
specify the number of executors directly. How do I make Spark use all the
available executors?
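For reference, the setup described above can be sketched as a spark-submit invocation (the master URL and application name are placeholders; in standalone mode the executor count is not set directly but falls out of total cores divided by cores per executor):

```shell
# Sketch of the standalone-mode settings described above.
# 24 total cores / 4 cores per executor => 6 executors.
# Note: spread-out (spark.deploy.spreadOut) is a cluster-wide setting
# configured on the standalone master, not per application.
spark-submit \
  --master spark://master-host:7077 \
  --conf spark.executor.cores=4 \
  --conf spark.cores.max=24 \
  my_streaming_app.py
```

Equivalently, `--executor-cores 4` and `--total-executor-cores 24` can be passed as spark-submit flags instead of `--conf` pairs.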



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-streaming-uses-lesser-number-of-executors-tp28042.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org