Posted to user@spark.apache.org by Jorge Rodriguez <jo...@bloomreach.com> on 2016/02/20 20:24:35 UTC

Spark Streaming: Is it possible to schedule multiple active batches?

Is it possible to have the scheduler schedule the next batch even if the
previous batch has not completed yet?  I'd like to schedule up to 3 batches
at the same time if this is possible.

Re: Spark Streaming: Is it possible to schedule multiple active batches?

Posted by Neelesh <ne...@gmail.com>.
spark.streaming.concurrentJobs may help. It is experimental, according to TD
in an older thread here:
http://stackoverflow.com/questions/23528006/how-jobs-are-assigned-to-executors-in-spark-streaming
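For reference, a minimal sketch of where the setting would go (untested; the
app name, batch interval, and the value "3" are just placeholders I picked for
illustration, and the flag itself is undocumented/experimental):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object ConcurrentBatchesExample {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("ConcurrentBatchesExample")
          // Experimental/undocumented: lets up to 3 batch jobs run
          // concurrently instead of the default of 1.
          .set("spark.streaming.concurrentJobs", "3")

        // 1-second batch interval, chosen only for the example.
        val ssc = new StreamingContext(conf, Seconds(1))

        // ... define input DStreams and transformations here ...

        ssc.start()
        ssc.awaitTermination()
      }
    }

You could also pass it on the command line with
--conf spark.streaming.concurrentJobs=3 instead of setting it in code.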

On Sat, Feb 20, 2016 at 11:24 AM, Jorge Rodriguez <jo...@bloomreach.com>
wrote:

>
> Is it possible to have the scheduler schedule the next batch even if the
> previous batch has not completed yet?  I'd like to schedule up to 3 batches
> at the same time if this is possible.
>