Posted to user@spark.apache.org by Rachana Srivastava <Ra...@markmonitor.com> on 2016/06/22 15:43:36 UTC

Unable to increase Active Tasks of a Spark Streaming Process in YARN

Hello all,

I am running a Spark Streaming process that receives batches of 6000 events. But when I look at the executors, only one active task is running. I have tried dynamic allocation as well as setting the number of executors explicitly. Even with 15 executors, only one active task runs at a time. Can anyone please tell me what I am doing wrong here?
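For reference, a stripped-down sketch of the kind of setup described above. The source, port, batch interval, and repartition factor are placeholders, not the actual job; the point is that a single-receiver stream produces batches with very few partitions, so every record can end up in one task no matter how many executors are allocated, and repartitioning each batch is one way to spread the work:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingParallelismSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("streaming-parallelism-sketch")
      // One of the attempts described above: pin the executor count...
      .set("spark.executor.instances", "15")
      // ...or, alternatively, enable dynamic allocation:
      // .set("spark.dynamicAllocation.enabled", "true")

    val ssc = new StreamingContext(conf, Seconds(10)) // batch interval is an assumption

    // A single receiver-based source yields RDDs with few partitions,
    // so all 6000 events in a batch can land in a single task.
    val events = ssc.socketTextStream("localhost", 9999) // placeholder source

    // Repartitioning each batch spreads records across the executors;
    // without this (or a multi-partition source), only one task runs.
    val spread = events.repartition(15)

    spread.foreachRDD { rdd =>
      println(s"partitions = ${rdd.getNumPartitions}, count = ${rdd.count()}")
    }

    ssc.start()
    ssc.awaitTermination()
  }
}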

[Two inline image attachments omitted]