Posted to dev@spark.apache.org by Praveen Sripati <pr...@gmail.com> on 2014/11/27 12:47:24 UTC

Standalone scheduling - document inconsistent

Hi,

There is a bit of an inconsistency in the documentation. Which statement is
correct?

`http://spark.apache.org/docs/latest/spark-standalone.html` says

The standalone cluster mode currently only supports a simple FIFO scheduler
across applications.

while `http://spark.apache.org/docs/latest/job-scheduling.html` says

Starting in Spark 0.8, it is also possible to configure fair sharing
between jobs.

Thanks,
Praveen

Re: Standalone scheduling - document inconsistent

Posted by Reynold Xin <rx...@databricks.com>.
The first statement refers to scheduling across different Spark applications
connecting to the standalone cluster manager, while the second refers to
scheduling within a single Spark application, where jobs can be scheduled
using a fair scheduler.
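To illustrate the within-application case: you can enable the fair scheduler by setting `spark.scheduler.mode` to `FAIR` in your SparkConf, and assign jobs to named pools from a thread via `sc.setLocalProperty("spark.scheduler.pool", "production")`. Pool properties are declared in a `fairscheduler.xml` file (pointed to by `spark.scheduler.allocation.file`). A minimal sketch follows; the pool names "production" and "adhoc" and the specific weights are illustrative assumptions, not values from this thread:

```xml
<?xml version="1.0"?>
<!-- Hypothetical fairscheduler.xml: two pools with different priorities. -->
<allocations>
  <!-- Higher-weight pool: gets roughly twice the share of cluster resources. -->
  <pool name="production">
    <schedulingMode>FAIR</schedulingMode>
    <weight>2</weight>
    <minShare>1</minShare>
  </pool>
  <!-- Lower-priority pool for ad-hoc jobs; FIFO within the pool. -->
  <pool name="adhoc">
    <schedulingMode>FIFO</schedulingMode>
    <weight>1</weight>
    <minShare>0</minShare>
  </pool>
</allocations>
```

Note this only affects jobs inside one application; applications submitted to the standalone master are still scheduled FIFO relative to each other.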


On Thu, Nov 27, 2014 at 3:47 AM, Praveen Sripati <pr...@gmail.com>
wrote:

> Hi,
>
> There is a bit of an inconsistency in the documentation. Which statement is
> correct?
>
> `http://spark.apache.org/docs/latest/spark-standalone.html` says
>
> The standalone cluster mode currently only supports a simple FIFO scheduler
> across applications.
>
> while `http://spark.apache.org/docs/latest/job-scheduling.html` says
>
> Starting in Spark 0.8, it is also possible to configure fair sharing
> between jobs.
>
> Thanks,
> Praveen
>