Posted to reviews@spark.apache.org by tdas <gi...@git.apache.org> on 2014/08/26 20:55:59 UTC

[GitHub] spark pull request: [SPARK-3139] Made ContextCleaner to not block ...

GitHub user tdas opened a pull request:

    https://github.com/apache/spark/pull/2143

    [SPARK-3139] Made ContextCleaner to not block on shuffles

    As a workaround for SPARK-3015, the ContextCleaner was made "blocking", that is, it cleaned items one by one. But shuffles can take a long time to be deleted. Given that the RC for 1.1 is imminent, this PR makes a narrow change in the context cleaner: it no longer waits for shuffle cleanups to complete. It also downgrades the error messages on failed deletes to milder warnings, since an exception in the delete code path for one item does not really stop the actual functioning of the system.
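The change described above can be sketched as follows. This is a hypothetical, simplified illustration of the idea (keep blocking for some cleanups, fire-and-forget for shuffles, downgrade failures to warnings), not Spark's actual ContextCleaner code; all names in it are invented for the example.

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

// Hypothetical sketch: run each cleanup task asynchronously, block only when
// asked to, and log failures as warnings instead of letting one bad delete
// stop the cleaner.
object CleanerSketch {
  def cleanup(item: String, blocking: Boolean)(body: => Unit): Unit = {
    val task = Future(body)
    task.failed.foreach(e => println(s"WARN: failed to clean $item: ${e.getMessage}"))
    if (blocking) {
      Await.ready(task, 30.seconds) // blocking waits can time out on slow deletes
    }
  }

  def main(args: Array[String]): Unit = {
    cleanup("RDD 1", blocking = true) { () }                     // RDD cleanup: still blocking
    cleanup("shuffle 0", blocking = false) { Thread.sleep(200) } // shuffle: fire and forget
    println("cleaner thread moved on without waiting for the shuffle")
  }
}
```

The point of the sketch is only that a slow or failing shuffle delete neither stalls the cleaning thread nor surfaces as an error.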

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/tdas/spark cleaner-shuffle-fix

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/2143.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #2143
    
----
commit 387b5787fcff0903770fc6dfea082dd2c4aca756
Author: Tathagata Das <ta...@gmail.com>
Date:   2014-08-26T18:43:09Z

    Made ContextCleaner to not block on shuffles

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org



Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2143#issuecomment-53508574
  
      [QA tests have started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19248/consoleFull) for PR 2143 at commit [`2181329`](https://github.com/apache/spark/commit/2181329d704b7aa79612c2026bca04e5b7fd8e8c).
     * This patch merges cleanly.



Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2143#discussion_r16754923
  
    --- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
    @@ -76,6 +76,20 @@ private[spark] class ContextCleaner(sc: SparkContext) extends Logging {
       private val blockOnCleanupTasks = sc.conf.getBoolean(
         "spark.cleaner.referenceTracking.blocking", true)
     
    +  /**
    +   * Whether the cleaning thread will block on shuffle cleanup tasks.
    +   * This overrides the global setting `blockOnCleanupTasks`
    +   *
    +   * When context cleaner is configured to block on every delete request, it can throw timeout
    +   * exceptions on cleanup of shuffle blocks, as reported in SPARK-3139. To avoid that, this
    +   * parameter by default disables blocking on shuffle cleanups. Note that this does not affect
    +   * the cleanup of RDDs and broadcasts. This is intended to be a temporary workaround,
    +   * until the real Akka issue (referred to in the comment above `blockOnCleanupTasks`) is
    +   * resolved.
    +   */
    +  private val blockOnShuffleCleanupTasks = sc.conf.getBoolean(
    +    "spark.cleaner.referenceTracking.blocking.shuffle", false)
    --- End diff --
    
    @andrewor14 if you look at where it is used, that's actually the behavior. That said, it would be a bit simpler to set the precedence where it is defined rather than where it is used.
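The two styles can be sketched like this, assuming for illustration that blocking on shuffles requires both flags to be on; only the config keys come from the diff, and the class and member names are invented for the example.

```scala
// Hypothetical sketch contrasting where the precedence between the general
// flag and the shuffle-specific flag could be resolved.
class ConfSketch(conf: Map[String, Boolean]) {
  private def get(key: String, default: Boolean): Boolean = conf.getOrElse(key, default)

  val blockOnCleanupTasks: Boolean =
    get("spark.cleaner.referenceTracking.blocking", true)

  private val rawShuffleFlag: Boolean =
    get("spark.cleaner.referenceTracking.blocking.shuffle", false)

  // Precedence "where it is used": every call site combines the two flags.
  def shouldBlockOnShuffleAtUseSite: Boolean = blockOnCleanupTasks && rawShuffleFlag

  // Precedence "where it is defined": combine once, so call sites just read
  // a single field.
  val blockOnShuffleCleanupTasks: Boolean = blockOnCleanupTasks && rawShuffleFlag
}
```

Both variants compute the same answer; the difference is purely where readers of the code have to look to understand it.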



Posted by tdas <gi...@git.apache.org>.
Github user tdas commented on the pull request:

    https://github.com/apache/spark/pull/2143#issuecomment-53476184
  
    Jenkins, test this please.



Posted by tdas <gi...@git.apache.org>.
Github user tdas commented on the pull request:

    https://github.com/apache/spark/pull/2143#issuecomment-53530257
  
    The failed unit test is unrelated to this. 



Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2143#discussion_r16757841
  
    --- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
    @@ -76,6 +76,20 @@ private[spark] class ContextCleaner(sc: SparkContext) extends Logging {
       private val blockOnCleanupTasks = sc.conf.getBoolean(
         "spark.cleaner.referenceTracking.blocking", true)
     
    +  /**
    +   * Whether the cleaning thread will block on shuffle cleanup tasks.
    +   * This overrides the global setting `blockOnCleanupTasks`
    +   *
    +   * When context cleaner is configured to block on every delete request, it can throw timeout
    +   * exceptions on cleanup of shuffle blocks, as reported in SPARK-3139. To avoid that, this
    +   * parameter by default disables blocking on shuffle cleanups. Note that this does not affect
    +   * the cleanup of RDDs and broadcasts. This is intended to be a temporary workaround,
    +   * until the real Akka issue (referred to in the comment above `blockOnCleanupTasks`) is
    +   * resolved.
    +   */
    +  private val blockOnShuffleCleanupTasks = sc.conf.getBoolean(
    +    "spark.cleaner.referenceTracking.blocking.shuffle", false)
    --- End diff --
    
    I'd prefer to keep it as-is and just have this be undocumented for now.



Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2143#discussion_r16734948
  
    --- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
    @@ -76,6 +76,20 @@ private[spark] class ContextCleaner(sc: SparkContext) extends Logging {
       private val blockOnCleanupTasks = sc.conf.getBoolean(
         "spark.cleaner.referenceTracking.blocking", true)
     
    +  /**
    +   * Whether to disable blocking on shuffle tasks. This override is effective only when
    +   * blocking on cleanup tasks is enabled.
    +   *
    +   * When context cleaner is configured to block on every delete request, it can throw timeout
    +   * exceptions on cleanup of shuffle blocks, as reported in SPARK-3139. To avoid that, this
    +   * parameter disables blocking on shuffle cleanups (when `blockOnCleanupTasks` is true).
    +   * Note that this does not affect the cleanup of RDDs and broadcasts.
    +   * This is intended to be a temporary workaround, until the real Akka issue (referred to in
    +   * the comment above `blockOnCleanupTasks`) is resolved.
    +   */
    +  private val disableBlockOnShuffleCleanupTasks = sc.conf.getBoolean(
    +    "spark.cleaner.referenceTracking.disableBlockingForShuffles", true)
    --- End diff --
    
    I like this idea.  Note that this reverses the effect of that variable, so the default should now be `false` (since we block for all cleaning operations by default but _not_ for shuffles).



Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/2143



Posted by tdas <gi...@git.apache.org>.
Github user tdas commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2143#discussion_r16756087
  
    --- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
    @@ -76,6 +76,20 @@ private[spark] class ContextCleaner(sc: SparkContext) extends Logging {
       private val blockOnCleanupTasks = sc.conf.getBoolean(
         "spark.cleaner.referenceTracking.blocking", true)
     
    +  /**
    +   * Whether the cleaning thread will block on shuffle cleanup tasks.
    +   * This overrides the global setting `blockOnCleanupTasks`
    +   *
    +   * When context cleaner is configured to block on every delete request, it can throw timeout
    +   * exceptions on cleanup of shuffle blocks, as reported in SPARK-3139. To avoid that, this
    +   * parameter by default disables blocking on shuffle cleanups. Note that this does not affect
    +   * the cleanup of RDDs and broadcasts. This is intended to be a temporary workaround,
    +   * until the real Akka issue (referred to in the comment above `blockOnCleanupTasks`) is
    +   * resolved.
    +   */
    +  private val blockOnShuffleCleanupTasks = sc.conf.getBoolean(
    +    "spark.cleaner.referenceTracking.blocking.shuffle", false)
    --- End diff --
    
    Though, @pwendell consider @andrewor14 suggestion of having separate parameters
    `spark.cleaner.referenceTracking.blocking.{rdd/shuffle/broadcast}` . I am on the fence.



Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2143#discussion_r16756688
  
    --- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
    @@ -76,6 +76,20 @@ private[spark] class ContextCleaner(sc: SparkContext) extends Logging {
       private val blockOnCleanupTasks = sc.conf.getBoolean(
         "spark.cleaner.referenceTracking.blocking", true)
     
    +  /**
    +   * Whether the cleaning thread will block on shuffle cleanup tasks.
    +   * This overrides the global setting `blockOnCleanupTasks`
    +   *
    +   * When context cleaner is configured to block on every delete request, it can throw timeout
    +   * exceptions on cleanup of shuffle blocks, as reported in SPARK-3139. To avoid that, this
    +   * parameter by default disables blocking on shuffle cleanups. Note that this does not affect
    +   * the cleanup of RDDs and broadcasts. This is intended to be a temporary workaround,
    +   * until the real Akka issue (referred to in the comment above `blockOnCleanupTasks`) is
    +   * resolved.
    +   */
    +  private val blockOnShuffleCleanupTasks = sc.conf.getBoolean(
    +    "spark.cleaner.referenceTracking.blocking.shuffle", false)
    --- End diff --
    
    I'm fine with the changes as is. This is not a huge deal since we don't expose it.



Posted by tdas <gi...@git.apache.org>.
Github user tdas commented on the pull request:

    https://github.com/apache/spark/pull/2143#issuecomment-53471207
  
    @pwendell @JoshRosen @andrewor14



Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2143#issuecomment-53476837
  
      [QA tests have started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19231/consoleFull) for PR 2143 at commit [`e337cc2`](https://github.com/apache/spark/commit/e337cc2d6731ed709afe78a958242abd89c3df56).
     * This patch merges cleanly.



Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2143#issuecomment-53523551
  
      [QA tests have started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19276/consoleFull) for PR 2143 at commit [`9c84202`](https://github.com/apache/spark/commit/9c84202631ccc82a99179e7a9dbfdff3a1d32c55).
     * This patch merges cleanly.



Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/2143#issuecomment-53472382
  
    Small piece of feedback on the parameter naming - what do you think @JoshRosen 



Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/2143#issuecomment-53527699
  
    Jenkins, retest this please.



Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2143#discussion_r16754019
  
    --- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
    @@ -76,6 +76,20 @@ private[spark] class ContextCleaner(sc: SparkContext) extends Logging {
       private val blockOnCleanupTasks = sc.conf.getBoolean(
         "spark.cleaner.referenceTracking.blocking", true)
     
    +  /**
    +   * Whether the cleaning thread will block on shuffle cleanup tasks.
    +   * This overrides the global setting `blockOnCleanupTasks`
    --- End diff --
    
    maybe say "this overrides the global setting `spark.cleaner.referenceTracking.blocking`" instead? 



Posted by tdas <gi...@git.apache.org>.
Github user tdas commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2143#discussion_r16735777
  
    --- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
    @@ -76,6 +76,20 @@ private[spark] class ContextCleaner(sc: SparkContext) extends Logging {
       private val blockOnCleanupTasks = sc.conf.getBoolean(
         "spark.cleaner.referenceTracking.blocking", true)
     
    +  /**
    +   * Whether to disable blocking on shuffle tasks. This override is effective only when
    +   * blocking on cleanup tasks is enabled.
    +   *
    +   * When context cleaner is configured to block on every delete request, it can throw timeout
    +   * exceptions on cleanup of shuffle blocks, as reported in SPARK-3139. To avoid that, this
    +   * parameter disables blocking on shuffle cleanups (when `blockOnCleanupTasks` is true).
    +   * Note that this does not affect the cleanup of RDDs and broadcasts.
    +   * This is intended to be a temporary workaround, until the real Akka issue (referred to in
    +   * the comment above `blockOnCleanupTasks`) is resolved.
    +   */
    +  private val disableBlockOnShuffleCleanupTasks = sc.conf.getBoolean(
    +    "spark.cleaner.referenceTracking.disableBlockingForShuffles", true)
    --- End diff --
    
    Yes, I was following Josh's argument. I found this more intuitive, but I have no strong opinions. Whatever you guys tell me. 



Posted by tdas <gi...@git.apache.org>.
Github user tdas commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2143#discussion_r16756006
  
    --- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
    @@ -76,6 +76,20 @@ private[spark] class ContextCleaner(sc: SparkContext) extends Logging {
       private val blockOnCleanupTasks = sc.conf.getBoolean(
         "spark.cleaner.referenceTracking.blocking", true)
     
    +  /**
    +   * Whether the cleaning thread will block on shuffle cleanup tasks.
    +   * This overrides the global setting `blockOnCleanupTasks`
    +   *
    +   * When context cleaner is configured to block on every delete request, it can throw timeout
    +   * exceptions on cleanup of shuffle blocks, as reported in SPARK-3139. To avoid that, this
    +   * parameter by default disables blocking on shuffle cleanups. Note that this does not affect
    +   * the cleanup of RDDs and broadcasts. This is intended to be a temporary workaround,
    +   * until the real Akka issue (referred to in the comment above `blockOnCleanupTasks`) is
    +   * resolved.
    +   */
    +  private val blockOnShuffleCleanupTasks = sc.conf.getBoolean(
    +    "spark.cleaner.referenceTracking.blocking.shuffle", false)
    --- End diff --
    
    Spoke to @andrewor14 and found a potential issue in the unit tests. The unit tests MAY fail, as they assumed everything was blocking, but we are changing that assumption. Updating the PR right now.



Posted by tdas <gi...@git.apache.org>.
Github user tdas commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2143#discussion_r16736215
  
    --- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
    @@ -76,6 +76,20 @@ private[spark] class ContextCleaner(sc: SparkContext) extends Logging {
       private val blockOnCleanupTasks = sc.conf.getBoolean(
         "spark.cleaner.referenceTracking.blocking", true)
     
    +  /**
    +   * Whether to disable blocking on shuffle tasks. This override is effective only when
    +   * blocking on cleanup tasks is enabled.
    +   *
    +   * When context cleaner is configured to block on every delete request, it can throw timeout
    +   * exceptions on cleanup of shuffle blocks, as reported in SPARK-3139. To avoid that, this
    +   * parameter disables blocking on shuffle cleanups (when `blockOnCleanupTasks` is true).
    +   * Note that this does not affect the cleanup of RDDs and broadcasts.
    +   * This is intended to be a temporary workaround, until the real Akka issue (referred to in
    +   * the comment above `blockOnCleanupTasks`) is resolved.
    +   */
    +  private val disableBlockOnShuffleCleanupTasks = sc.conf.getBoolean(
    +    "spark.cleaner.referenceTracking.disableBlockingForShuffles", true)
    --- End diff --
    
    Ummm, what will the semantics be? What does it mean if `spark.cleaner.referenceTracking.blocking` is false, but `spark.cleaner.referenceTracking.blocking.shuffle` is true?



Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2143#discussion_r16754201
  
    --- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
    @@ -76,6 +76,20 @@ private[spark] class ContextCleaner(sc: SparkContext) extends Logging {
       private val blockOnCleanupTasks = sc.conf.getBoolean(
         "spark.cleaner.referenceTracking.blocking", true)
     
    +  /**
    +   * Whether the cleaning thread will block on shuffle cleanup tasks.
    +   * This overrides the global setting `blockOnCleanupTasks`
    +   *
    +   * When context cleaner is configured to block on every delete request, it can throw timeout
    +   * exceptions on cleanup of shuffle blocks, as reported in SPARK-3139. To avoid that, this
    +   * parameter by default disables blocking on shuffle cleanups. Note that this does not affect
    +   * the cleanup of RDDs and broadcasts. This is intended to be a temporary workaround,
    +   * until the real Akka issue (referred to in the comment above `blockOnCleanupTasks`) is
    +   * resolved.
    +   */
    +  private val blockOnShuffleCleanupTasks = sc.conf.getBoolean(
    +    "spark.cleaner.referenceTracking.blocking.shuffle", false)
    --- End diff --
    
    So that means `spark.cleaner.referenceTracking.blocking` actually has no effect on the shuffles ever. Then I think it might make sense to separate this out into `spark.cleaner.referenceTracking.blocking.{rdd/shuffle/broadcast}`? It's just a little confusing right now because I would imagine that `spark.cleaner.referenceTracking.blocking` also controls the shuffle behavior if the shuffle-specific config is not set.
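The per-type scheme suggested above could look roughly like this sketch, assuming each specific key falls back to the general key when unset (one plausible reading of the suggestion, not a committed design); `PerTypeConfSketch` and `blockingFor` are invented names.

```scala
// Hypothetical sketch of per-type blocking flags, each defaulting to the
// general spark.cleaner.referenceTracking.blocking setting when unset.
class PerTypeConfSketch(settings: Map[String, String]) {
  private val prefix = "spark.cleaner.referenceTracking.blocking"
  private def get(key: String): Option[Boolean] = settings.get(key).map(_.toBoolean)

  // e.g. blockingFor("rdd"), blockingFor("shuffle"), blockingFor("broadcast")
  def blockingFor(cleanupType: String): Boolean =
    get(s"$prefix.$cleanupType").getOrElse(get(prefix).getOrElse(true))
}
```

Under this rule the general key does control shuffles whenever the shuffle-specific key is absent, which matches the intuition described in the comment above.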



Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2143#discussion_r16755854
  
    --- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
    @@ -76,6 +76,20 @@ private[spark] class ContextCleaner(sc: SparkContext) extends Logging {
       private val blockOnCleanupTasks = sc.conf.getBoolean(
         "spark.cleaner.referenceTracking.blocking", true)
     
    +  /**
    +   * Whether the cleaning thread will block on shuffle cleanup tasks.
    +   * This overrides the global setting `blockOnCleanupTasks`
    +   *
    +   * When context cleaner is configured to block on every delete request, it can throw timeout
    +   * exceptions on cleanup of shuffle blocks, as reported in SPARK-3139. To avoid that, this
    +   * parameter by default disables blocking on shuffle cleanups. Note that this does not affect
    +   * the cleanup of RDDs and broadcasts. This is intended to be a temporary workaround,
    +   * until the real Akka issue (referred to in the comment above `blockOnCleanupTasks`) is
    +   * resolved.
    +   */
    +  private val blockOnShuffleCleanupTasks = sc.conf.getBoolean(
    +    "spark.cleaner.referenceTracking.blocking.shuffle", false)
    --- End diff --
    
    gotcha - I think it's fine as-is. This is not a user-visible config anyways.



Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2143#discussion_r16735808
  
    --- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
    @@ -76,6 +76,20 @@ private[spark] class ContextCleaner(sc: SparkContext) extends Logging {
       private val blockOnCleanupTasks = sc.conf.getBoolean(
         "spark.cleaner.referenceTracking.blocking", true)
     
    +  /**
    +   * Whether to disable blocking on shuffle tasks. This override is effective only when
    +   * blocking on cleanup tasks is enabled.
    +   *
    +   * When context cleaner is configured to block on every delete request, it can throw timeout
    +   * exceptions on cleanup of shuffle blocks, as reported in SPARK-3139. To avoid that, this
    +   * parameter disables blocking on shuffle cleanups when `blockOnCleanupTasks` is true.
    +   * Note that this does not affect the cleanup of RDDs and broadcasts.
    +   * This is intended to be a temporary workaround, until the real Akka issue (referred to in
    +   * the comment above `blockOnCleanupTasks`) is resolved.
    +   */
    +  private val disableBlockOnShuffleCleanupTasks = sc.conf.getBoolean(
    +    "spark.cleaner.referenceTracking.disableBlockingForShuffles", true)
    --- End diff --
    
    +1 This is consistent with configs in other places where the more specific one overrides the general one. Also, everywhere else we only have `enable` configs, so having a `disable` one is not intuitive to me.



[GitHub] spark pull request: [SPARK-3139] Made ContextCleaner to not block ...

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2143#discussion_r16754954
  
    --- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
    @@ -76,6 +76,20 @@ private[spark] class ContextCleaner(sc: SparkContext) extends Logging {
       private val blockOnCleanupTasks = sc.conf.getBoolean(
         "spark.cleaner.referenceTracking.blocking", true)
     
    +  /**
    +   * Whether the cleaning thread will block on shuffle cleanup tasks.
    +   * This overrides the global setting `blockOnCleanupTasks`
    +   *
    +   * When context cleaner is configured to block on every delete request, it can throw timeout
    +   * exceptions on cleanup of shuffle blocks, as reported in SPARK-3139. To avoid that, this
    +   * parameter by default disables blocking on shuffle cleanups. Note that this does not affect
    +   * the cleanup of RDDs and broadcasts. This is intended to be a temporary workaround,
    +   * until the real Akka issue (referred to in the comment above `blockOnCleanupTasks`) is
    +   * resolved.
    +   */
    +  private val blockOnShuffleCleanupTasks = sc.conf.getBoolean(
    +    "spark.cleaner.referenceTracking.blocking.shuffle", false)
    --- End diff --
    
    Oh wait - I guess that's impossible if we want the default to be true for one and false for the other.



[GitHub] spark pull request: [SPARK-3139] Made ContextCleaner to not block ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2143#issuecomment-53530699
  
      [QA tests have started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19296/consoleFull) for   PR 2143 at commit [`9c84202`](https://github.com/apache/spark/commit/9c84202631ccc82a99179e7a9dbfdff3a1d32c55).
     * This patch merges cleanly.



[GitHub] spark pull request: [SPARK-3139] Made ContextCleaner to not block ...

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2143#discussion_r16754934
  
    --- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
    @@ -76,6 +76,20 @@ private[spark] class ContextCleaner(sc: SparkContext) extends Logging {
       private val blockOnCleanupTasks = sc.conf.getBoolean(
         "spark.cleaner.referenceTracking.blocking", true)
     
    +  /**
    +   * Whether the cleaning thread will block on shuffle cleanup tasks.
    +   * This overrides the global setting `blockOnCleanupTasks`
    +   *
    +   * When context cleaner is configured to block on every delete request, it can throw timeout
    +   * exceptions on cleanup of shuffle blocks, as reported in SPARK-3139. To avoid that, this
    +   * parameter by default disables blocking on shuffle cleanups. Note that this does not affect
    +   * the cleanup of RDDs and broadcasts. This is intended to be a temporary workaround,
    +   * until the real Akka issue (referred to in the comment above `blockOnCleanupTasks`) is
    +   * resolved.
    +   */
    +  private val blockOnShuffleCleanupTasks = sc.conf.getBoolean(
    +    "spark.cleaner.referenceTracking.blocking.shuffle", false)
    --- End diff --
    
    oh wait - sorry I was looking at an older patch. @tdas I agree, this should fall back to the value of `spark.cleaner.referenceTracking.blocking` if `spark.cleaner.referenceTracking.blocking.shuffle` is not set.



[GitHub] spark pull request: [SPARK-3139] Made ContextCleaner to not block ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2143#issuecomment-53512306
  
      [QA tests have finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19248/consoleFull) for   PR 2143 at commit [`2181329`](https://github.com/apache/spark/commit/2181329d704b7aa79612c2026bca04e5b7fd8e8c).
     * This patch **passes** unit tests.
     * This patch merges cleanly.
     * This patch adds the following public classes _(experimental)_:
      * `protected class AttributeEquals(val a: Attribute) `




[GitHub] spark pull request: [SPARK-3139] Made ContextCleaner to not block ...

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/2143#issuecomment-53530260
  
    Jenkins, test this please.



[GitHub] spark pull request: [SPARK-3139] Made ContextCleaner to not block ...

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2143#discussion_r16734810
  
    --- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
    @@ -76,6 +76,20 @@ private[spark] class ContextCleaner(sc: SparkContext) extends Logging {
       private val blockOnCleanupTasks = sc.conf.getBoolean(
         "spark.cleaner.referenceTracking.blocking", true)
     
    +  /**
    +   * Whether to disable blocking on shuffle tasks. This override is effective only when
    +   * blocking on cleanup tasks is enabled.
    +   *
    +   * When context cleaner is configured to block on every delete request, it can throw timeout
    +   * exceptions on cleanup of shuffle blocks, as reported in SPARK-3139. To avoid that, this
    +   * parameter disables blocking on shuffle cleanups when `blockOnCleanupTasks` is true.
    +   * Note that this does not affect the cleanup of RDDs and broadcasts.
    +   * This is intended to be a temporary workaround, until the real Akka issue (referred to in
    +   * the comment above `blockOnCleanupTasks`) is resolved.
    +   */
    +  private val disableBlockOnShuffleCleanupTasks = sc.conf.getBoolean(
    +    "spark.cleaner.referenceTracking.disableBlockingForShuffles", true)
    --- End diff --
    
    Rather than having one option and then another option that disables the first, what about just having this?
    
    ```
    # Existing option
    spark.cleaner.referenceTracking.blocking
    
    # More specific option for shuffles
    spark.cleaner.referenceTracking.blocking.shuffles
    ```
    
    This is the same style as the delay scheduling options.
    
    ```
    spark.locality.wait
    spark.locality.wait.process
    spark.locality.wait.node
    ...
    ```




[GitHub] spark pull request: [SPARK-3139] Made ContextCleaner to not block ...

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/2143#issuecomment-53473285
  
    @pwendell I like your naming suggestion.
    
    This looks good to me.



[GitHub] spark pull request: [SPARK-3139] Made ContextCleaner to not block ...

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on the pull request:

    https://github.com/apache/spark/pull/2143#issuecomment-53534279
  
    Okay thanks TD and Andrew - I'm pulling this in.



[GitHub] spark pull request: [SPARK-3139] Made ContextCleaner to not block ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2143#issuecomment-53486768
  
      [QA tests have finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19231/consoleFull) for   PR 2143 at commit [`e337cc2`](https://github.com/apache/spark/commit/e337cc2d6731ed709afe78a958242abd89c3df56).
     * This patch **passes** unit tests.
     * This patch merges cleanly.
     * This patch adds no public classes.



[GitHub] spark pull request: [SPARK-3139] Made ContextCleaner to not block ...

Posted by tdas <gi...@git.apache.org>.
Github user tdas commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2143#discussion_r16754997
  
    --- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
    @@ -76,6 +76,20 @@ private[spark] class ContextCleaner(sc: SparkContext) extends Logging {
       private val blockOnCleanupTasks = sc.conf.getBoolean(
         "spark.cleaner.referenceTracking.blocking", true)
     
    +  /**
    +   * Whether the cleaning thread will block on shuffle cleanup tasks.
    +   * This overrides the global setting `blockOnCleanupTasks`
    +   *
    +   * When context cleaner is configured to block on every delete request, it can throw timeout
    +   * exceptions on cleanup of shuffle blocks, as reported in SPARK-3139. To avoid that, this
    +   * parameter by default disables blocking on shuffle cleanups. Note that this does not affect
    +   * the cleanup of RDDs and broadcasts. This is intended to be a temporary workaround,
    +   * until the real Akka issue (referred to in the comment above `blockOnCleanupTasks`) is
    +   * resolved.
    +   */
    +  private val blockOnShuffleCleanupTasks = sc.conf.getBoolean(
    +    "spark.cleaner.referenceTracking.blocking.shuffle", false)
    --- End diff --
    
    Then we need to enforce the default behavior of shuffle cleanup being non-blocking by setting "spark.cleaner.referenceTracking.blocking.shuffle = false" somewhere. Because if we do
    `blockOnShuffleCleanupTasks = sc.conf.getBoolean("spark.cleaner.referenceTracking.blocking.shuffle", blockOnCleanupTasks)` based on intuition, then the default behavior would be to block on shuffle cleanup.
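    The two defaulting strategies being debated above can be sketched as follows. This is a hypothetical Python rendering of the fallback behavior (the real code uses Scala's `SparkConf.getBoolean`); the `get_boolean` helper and the empty `conf` dict are illustrative stand-ins, not Spark API:

    ```python
    # Sketch: how the default for the shuffle-specific config changes depending
    # on whether it falls back to the general config or to a hardcoded False.
    def get_boolean(conf, key, default):
        # Mimics SparkConf.getBoolean: return the set value, else the default.
        value = conf.get(key)
        return value if value is not None else default

    conf = {}  # neither cleaner config is explicitly set by the user

    # General setting defaults to True.
    block_on_cleanup = get_boolean(
        conf, "spark.cleaner.referenceTracking.blocking", True)

    # "Intuitive" fallback: inherit the general setting.
    # With nothing set, shuffle cleanup would then BLOCK by default.
    inherit = get_boolean(
        conf, "spark.cleaner.referenceTracking.blocking.shuffle", block_on_cleanup)

    # Workaround taken in this PR: hardcode the default to False,
    # so shuffle cleanup is non-blocking unless explicitly enabled.
    workaround = get_boolean(
        conf, "spark.cleaner.referenceTracking.blocking.shuffle", False)

    print(inherit, workaround)  # True False
    ```

    This is why simply passing `blockOnCleanupTasks` as the default would silently flip the intended non-blocking behavior for shuffles.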



[GitHub] spark pull request: [SPARK-3139] Made ContextCleaner to not block ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2143#issuecomment-53527919
  
      [QA tests have started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19289/consoleFull) for   PR 2143 at commit [`9c84202`](https://github.com/apache/spark/commit/9c84202631ccc82a99179e7a9dbfdff3a1d32c55).
     * This patch merges cleanly.



[GitHub] spark pull request: [SPARK-3139] Made ContextCleaner to not block ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2143#issuecomment-53530146
  
      [QA tests have finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19289/consoleFull) for   PR 2143 at commit [`9c84202`](https://github.com/apache/spark/commit/9c84202631ccc82a99179e7a9dbfdff3a1d32c55).
     * This patch **fails** unit tests.
     * This patch merges cleanly.
     * This patch adds no public classes.



[GitHub] spark pull request: [SPARK-3139] Made ContextCleaner to not block ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2143#issuecomment-53534120
  
      [QA tests have finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19296/consoleFull) for   PR 2143 at commit [`9c84202`](https://github.com/apache/spark/commit/9c84202631ccc82a99179e7a9dbfdff3a1d32c55).
     * This patch **passes** unit tests.
     * This patch merges cleanly.
     * This patch adds the following public classes _(experimental)_:
      * `rem In this case, leave out the main class (org.apache.spark.deploy.SparkSubmit) and use our own.`




[GitHub] spark pull request: [SPARK-3139] Made ContextCleaner to not block ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2143#issuecomment-53471880
  
      [QA tests have started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19227/consoleFull) for   PR 2143 at commit [`387b578`](https://github.com/apache/spark/commit/387b5787fcff0903770fc6dfea082dd2c4aca756).
     * This patch merges cleanly.



[GitHub] spark pull request: [SPARK-3139] Made ContextCleaner to not block ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2143#issuecomment-53525603
  
      [QA tests have finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/19276/consoleFull) for   PR 2143 at commit [`9c84202`](https://github.com/apache/spark/commit/9c84202631ccc82a99179e7a9dbfdff3a1d32c55).
     * This patch **fails** unit tests.
     * This patch merges cleanly.
     * This patch adds the following public classes _(experimental)_:
      * `"$FWDIR"/bin/spark-submit --class $CLASS "$`
      * `class ExternalSorter(object):`
      * `"$FWDIR"/bin/spark-submit --class $CLASS "$`
      * `protected class AttributeEquals(val a: Attribute) `


