Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/04/18 08:23:23 UTC

[GitHub] [spark] uncleGen opened a new pull request #24400: [SPARK-27503][DStream] JobGenerator thread exit for some fatal errors but application keeps running

URL: https://github.com/apache/spark/pull/24400
 
 
   ## What changes were proposed in this pull request?
   
    The JobGenerator thread (and other EventLoop threads) may exit on a fatal error, such as an OOM, while the Spark Streaming job keeps running with no batch jobs being generated. Currently, only non-fatal errors are caught and reported:
   
    ```scala
    override def run(): Unit = {
      try {
        while (!stopped.get) {
          val event = eventQueue.take()
          try {
            onReceive(event)
          } catch {
            case NonFatal(e) =>
              try {
                onError(e)
              } catch {
                case NonFatal(e) => logError("Unexpected error in " + name, e)
              }
          }
        }
      } catch {
        case ie: InterruptedException => // exit even if eventQueue is not empty
        case NonFatal(e) => logError("Unexpected error in " + name, e)
      }
    }
    ```
   
    Because `NonFatal` does not match errors such as `OutOfMemoryError`, a fatal error escapes both catch blocks and silently terminates the event thread, while the driver thread keeps running as if nothing had happened.
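    
    For illustration only (not the change proposed in this PR), here is a minimal sketch of an event loop that surfaces fatal errors to a caller-supplied callback instead of letting the thread die silently. The class `FailureAwareEventLoop` and the hook `onFatalError` are hypothetical names, not Spark APIs:
    
    ```scala
    import java.util.concurrent.LinkedBlockingDeque
    import java.util.concurrent.atomic.AtomicBoolean
    import scala.util.control.NonFatal
    
    // Hypothetical sketch: similar in shape to Spark's EventLoop, but fatal
    // throwables are reported via onFatalError so the application can react.
    abstract class FailureAwareEventLoop[E](name: String) {
      private val eventQueue = new LinkedBlockingDeque[E]()
      private val stopped = new AtomicBoolean(false)
    
      private val eventThread = new Thread(name) {
        override def run(): Unit = {
          try {
            while (!stopped.get) {
              val event = eventQueue.take()
              try {
                onReceive(event)
              } catch {
                case NonFatal(e) =>
                  try onError(e) catch {
                    case NonFatal(t) => println(s"Unexpected error in $name: $t")
                  }
              }
            }
          } catch {
            case _: InterruptedException => // normal shutdown path
            case t: Throwable =>
              // Fatal errors (e.g. OutOfMemoryError) are no longer swallowed:
              // hand them to the application instead of losing the thread silently.
              onFatalError(t)
          }
        }
      }
    
      def start(): Unit = eventThread.start()
      def stop(): Unit = { stopped.set(true); eventThread.interrupt() }
      def post(event: E): Unit = eventQueue.put(event)
    
      protected def onReceive(event: E): Unit
      protected def onError(e: Throwable): Unit
      protected def onFatalError(t: Throwable): Unit
    }
    ```
    
    A JobGenerator-like user of such a loop could, for example, implement `onFatalError` to report the error and stop the StreamingContext, so the failure becomes visible instead of the job hanging without generating batches.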
   
   ## How was this patch tested?
   
    Existing unit tests.
