Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/10/31 06:55:38 UTC

[GitHub] [spark] mridulm edited a comment on pull request #34098: [SPARK-36842][Core] TaskSchedulerImpl - stop TaskResultGetter properly

mridulm edited a comment on pull request #34098:
URL: https://github.com/apache/spark/pull/34098#issuecomment-955647214


   Thanks for digging more, @lxian!
   Apologies for the delay in getting back on this; to add to the answers to my queries:
   
   * Re: `mapOutputTracker.stop()` can throw `SparkException` in case of timeout
     * As @lxian pointed out, this can't happen now after Holden's changes (I think I might have been looking at a different branch, sorry for the confusion).
   * Re: `metricsSystem.stop()` could throw an exception, depending on the Sink.
     * As @lxian detailed, the current Spark Sinks should not cause this to happen. Having said that:
     * Spark supports plugging in custom Sinks, so looking only at what exists in our codebase is unfortunately insufficient.
       * An exception here prevents everything else in `SparkEnv.stop` from running.
     * To be defensive, handling this would be better (a minimal sketch follows this list) - thoughts?
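   
   To be concrete, here is a minimal sketch (assumed names, not Spark's actual `SparkEnv.stop` code) of the kind of defensive wrapping I have in mind, so that an exception from a custom Sink's `close()` cannot prevent the remaining cleanup from running:
   
   ```scala
   import scala.util.control.NonFatal
   
   object ShutdownSketch {
     // Analogous to Spark's Utils.tryLogNonFatalError: log non-fatal errors and continue.
     def tryLogNonFatalError(block: => Unit): Unit = {
       try block
       catch {
         case NonFatal(e) =>
           System.err.println(s"Exception while stopping component, continuing shutdown: $e")
       }
     }
   
     // Hypothetical stand-ins for SparkEnv's components.
     trait Stoppable { def stop(): Unit }
   
     def stopAll(metricsSystem: Stoppable, blockManager: Stoppable): Unit = {
       // A non-fatal exception from a custom Sink's close() is contained here ...
       tryLogNonFatalError { metricsSystem.stop() }
       // ... so later steps still run. NonFatal does not match InterruptedException,
       // so an interrupt from blockManager.stop() would still propagate (existing behavior).
       tryLogNonFatalError { blockManager.stop() }
     }
   }
   ```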
   
   Both of the following are related to `InterruptedException`:
   * `blockManager.stop()` can throw `InterruptedException`
   * `rpcEnv.awaitTermination` could throw `InterruptedException`
   
   I agree with @lxian: `InterruptedException` is not caught by `Utils.tryLogNonFatalError` anyway, so let us preserve the existing behavior there.
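   
   A small check (assuming `Utils.tryLogNonFatalError` is built on `scala.util.control.NonFatal`, as in Spark core) illustrating why `InterruptedException` would still propagate even if those calls were wrapped:
   
   ```scala
   import scala.util.control.NonFatal
   
   object NonFatalCheck extends App {
     println(NonFatal(new RuntimeException("sink failed")))  // true  -> would be caught and logged
     println(NonFatal(new InterruptedException("shutdown"))) // false -> rethrown to the caller
   }
   ```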
   
   Given the above, can we address the potential issue with `Sink.close`?
   
   




---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org