Posted to user@spark.apache.org by Mingyu Kim <mk...@palantir.com> on 2013/11/20 12:39:39 UTC

Job cancellation

Hi all,

Cancellation seems to be supported at application level. In other words, you
can call stop() on your instance of SparkContext in order to stop the
computation associated with the SparkContext. Is there any way to cancel a
job? (To be clear, job is "a parallel computation consisting of multiple
tasks that gets spawned in response to a Spark action" as defined on the
Spark website.) The current RDD API doesn't seem to provide this
functionality, but I'm wondering if there is any way to do anything similar.
I'd like to be able to cancel a long-running job that is found to be
unnecessary without shutting down the SparkContext.
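[For concreteness, the application-level shutdown described above looks roughly like this. This is a minimal local-mode sketch using the SparkConf-style constructor; the app name and data are made up for illustration.]

import org.apache.spark.{SparkConf, SparkContext}

object StopContextExample {
  def main(args: Array[String]): Unit = {
    // Local-mode setup, made up for illustration.
    val sc = new SparkContext(new SparkConf().setAppName("stop-demo").setMaster("local[2]"))

    sc.parallelize(1 to 1000).count()  // runs one job to completion

    // stop() is application-level: it tears down the whole SparkContext,
    // so nothing else can be computed with it afterwards.
    sc.stop()
  }
}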

If there is no way to simulate the cancellation currently, is there any plan
to support this functionality? Or, is this just not part of the design or
desired uses of SparkContext?

Thanks!

Mingyu



Re: Job cancellation

Posted by Mingyu Kim <mk...@palantir.com>.
Awesome! That's exactly what I needed. Is there any estimated timeline for
the 0.8.1 release?

Mingyu

From:  Mark Hamstra <ma...@clearstorydata.com>
Reply-To:  "user@spark.incubator.apache.org"
<us...@spark.incubator.apache.org>
Date:  Wednesday, November 20, 2013 at 4:06 AM
To:  user <us...@spark.incubator.apache.org>
Subject:  Re: Job cancellation

Job cancellation has been in both the 0.8.1 SNAPSHOT and 0.9.0 SNAPSHOT for
a while now: PR29 <https://github.com/apache/incubator-spark/pull/29>, PR74
<https://github.com/apache/incubator-spark/pull/74>.
Modification/improvement of job cancellation is part of the open pull
request PR190 <https://github.com/apache/incubator-spark/pull/190> .


On Wed, Nov 20, 2013 at 3:39 AM, Mingyu Kim <mk...@palantir.com> wrote:
> Hi all,
> 
> Cancellation seems to be supported at application level. In other words, you
> can call stop() on your instance of SparkContext in order to stop the
> computation associated with the SparkContext. Is there any way to cancel a
> job? (To be clear, job is "a parallel computation consisting of multiple tasks
> that gets spawned in response to a Spark action" as defined on the Spark
> website.) The current RDD API doesn't seem to provide this functionality, but
> I'm wondering if there is any way to do anything similar. I'd like to be able
> to cancel a long-running job that is found to be unnecessary without shutting
> down the SparkContext.
> 
> If there is no way to simulate the cancellation currently, is there any plan
> to support this functionality? Or, is this just not part of the design or
> desired uses of SparkContext?
> 
> Thanks!
> 
> Mingyu




Re: Job cancellation

Posted by Mark Hamstra <ma...@clearstorydata.com>.
Job cancellation has been in both the 0.8.1 SNAPSHOT and 0.9.0 SNAPSHOT for
a while now: PR29 <https://github.com/apache/incubator-spark/pull/29>,
PR74 <https://github.com/apache/incubator-spark/pull/74>.
Modification/improvement of job cancellation is part of the open pull
request PR190 <https://github.com/apache/incubator-spark/pull/190>.
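[As a rough illustration of what per-job cancellation looks like in practice, here is a minimal sketch using the job-group API (SparkContext.setJobGroup / SparkContext.cancelJobGroup) that later Spark releases expose. Treating these exact methods as the surface added by the pull requests above is an assumption; check the PRs for the precise API available in 0.8.1/0.9.0.]

import org.apache.spark.{SparkConf, SparkContext}

object CancelJobExample {
  def main(args: Array[String]): Unit = {
    // Hypothetical local-mode setup, for illustration only.
    val sc = new SparkContext(new SparkConf().setAppName("cancel-demo").setMaster("local[2]"))

    // Submit a deliberately slow job from a background thread, tagged with a group id.
    val worker = new Thread(new Runnable {
      override def run(): Unit = {
        sc.setJobGroup("long-running", "a job we may decide to cancel")
        try {
          sc.parallelize(1 to 1000000, 100)
            .map { i => Thread.sleep(10); i }   // slow the tasks down artificially
            .count()
        } catch {
          case e: Exception => println(s"Job ended early: ${e.getMessage}")
        }
      }
    })
    worker.start()

    Thread.sleep(5000)                  // let a few tasks run...
    sc.cancelJobGroup("long-running")   // ...then cancel only that job group

    worker.join()

    // The SparkContext is still usable for other jobs; stop() shuts the whole application down.
    sc.parallelize(1 to 10).count()
    sc.stop()
  }
}

[The point of the sketch: cancellation is scoped to the job group, so the rest of the application, and the SparkContext itself, keeps running until stop() is called.]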


On Wed, Nov 20, 2013 at 3:39 AM, Mingyu Kim <mk...@palantir.com> wrote:

> Hi all,
>
> Cancellation seems to be supported at application level. In other words,
> you can call stop() on your instance of SparkContext in order to stop the
> computation associated with the SparkContext. Is there any way to cancel a
> job? (To be clear, job is "a parallel computation consisting of multiple
> tasks that gets spawned in response to a Spark action” as defined on the
> Spark website.) The current RDD API doesn’t seem to provide this
> functionality, but I’m wondering if there is any way to do anything
> similar. I’d like to be able to cancel a long-running job that is found to
> be unnecessary without shutting down the SparkContext.
>
> If there is no way to simulate the cancellation currently, is there any
> plan to support this functionality? Or, is this just not part of the design
> or desired uses of SparkContext?
>
> Thanks!
>
> Mingyu
>