Posted to user@spark.apache.org by kant kodali <ka...@gmail.com> on 2018/10/09 18:50:37 UTC

Does spark.streaming.concurrentJobs still exist?

spark.streaming.concurrentJobs (default: 1) is the number of concurrent
jobs, i.e. threads in the streaming-job-executor thread pool
(https://github.com/jaceklaskowski/spark-streaming-notebook/blob/master/spark-streaming-jobscheduler.adoc#streaming-job-executor).

Also, how is this setting different from executor-cores?