Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/01/23 13:04:34 UTC

[jira] [Commented] (SPARK-960) JobCancellationSuite "two jobs sharing the same stage" is broken

    [ https://issues.apache.org/jira/browse/SPARK-960?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14289136#comment-14289136 ] 

Apache Spark commented on SPARK-960:
------------------------------------

User 'srowen' has created a pull request for this issue:
https://github.com/apache/spark/pull/4180

> JobCancellationSuite "two jobs sharing the same stage" is broken
> ----------------------------------------------------------------
>
>                 Key: SPARK-960
>                 URL: https://issues.apache.org/jira/browse/SPARK-960
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 0.8.1, 0.9.0
>            Reporter: Mark Hamstra
>
> This test doesn't work as intended, because the map tasks can never acquire sem2.  The simplest way to demonstrate this is to comment out f1.cancel() in the future.  The intention appears to be that f1 and f2 would then complete normally, but they won't: both jobs block, waiting on sem2.  Closing over Semaphores doesn't work even in a local context, since sem2.hashCode() is different in each of f1, f2, and the future containing f1.cancel, so the map tasks never see the sem2.release(10) in that future (see the sketch after this quoted description).
> Instead, the test completes only because all of the stages (the two final stages and the stage common to both jobs) get cancelled and aborted.  Once job <--> stage dependencies are fully accounted for and job cancellation is changed so that f1.cancel does not abort the common stage, this test can never finish: it hangs waiting on sem2.
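
A minimal sketch (an editorial illustration, not the suite's actual code) of the closure-copy behavior described above: Spark serializes the task closure even in local mode, so each task deserializes its own copy of a captured Semaphore, and release() calls on the driver's instance are never observed by the tasks.  The names SemaphoreClosureSketch and SharedState below are hypothetical, and the contrast with an object-held semaphore assumes local mode, where all tasks run in the driver's JVM.

    import java.util.concurrent.Semaphore

    import org.apache.spark.{SparkConf, SparkContext}

    // Hypothetical names for illustration only.
    object SharedState {
      val sem = new Semaphore(0)
    }

    object SemaphoreClosureSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setMaster("local[2]").setAppName("semaphore-sketch"))

        // Broken pattern: the closure captures `sem`, so the Semaphore is
        // serialized with the task and each task gets its own deserialized copy.
        val sem = new Semaphore(0)
        val driverId = System.identityHashCode(sem)
        val taskIds = sc.parallelize(1 to 2, 2)
          .map(_ => System.identityHashCode(sem))
          .collect()
        // The task-side identities differ from the driver's: releasing the
        // driver's `sem` would never unblock a task waiting on its own copy.
        println(s"driver: $driverId, tasks: ${taskIds.mkString(", ")}")

        // Working pattern for local mode: keep the semaphore in an object.
        // An object's field is looked up statically at call time rather than
        // captured in the closure, so all tasks in the single local JVM share
        // one instance.  Permits persist, so the ordering of this release
        // relative to the acquires below does not matter.
        new Thread {
          override def run(): Unit = SharedState.sem.release(2)
        }.start()
        sc.parallelize(1 to 2, 2).foreach(_ => SharedState.sem.acquire())

        sc.stop()
      }
    }

Note that the object-based pattern only works because local mode runs tasks in the driver's JVM; on a real cluster, each executor JVM would hold its own SharedState instance.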



