Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/04/10 01:24:10 UTC

[GitHub] [spark] kotlovs edited a comment on pull request #32081: [SPARK-34674][CORE][K8S] Close SparkContext after the Main method has finished

kotlovs edited a comment on pull request #32081:
URL: https://github.com/apache/spark/pull/32081#issuecomment-817048293


   @dongjoon-hyun, @mridulm  thanks a lot for helping with this issue!
   
   Do you mean that this build problem affects only the Hive Thrift Server tests?
   Looking at the code of HiveThriftServer2, it appears to work like this: it starts the SparkContext and the server, exits main(), and then keeps running as a server that processes incoming requests.
   Closing the SparkContext after main() breaks this behavior.
   
   The code below doesn't look like a good solution, but for test purposes I was able to fix this test issue with the following:
   ```scala
   if (args.mainClass != "org.apache.spark.sql.hive.thriftserver.HiveThriftServer2") {
     SparkContext.getActive.foreach(_.stop())
   }
   ```
   Maybe we could introduce another application arg, for example _--isServer_ (the HiveThriftServer2 scripts would set it), and take it into account when closing the context?
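   To illustrate the idea, here is a minimal sketch of the proposed gating logic. The `--isServer` flag and the `shouldStopAfterMain` helper are hypothetical names, not existing Spark API; the sketch only models the decision of whether SparkSubmit should stop the active SparkContext once main() returns:

   ```scala
   // Hypothetical sketch (names made up for illustration): launcher scripts for
   // long-running applications such as HiveThriftServer2 would pass --isServer,
   // and SparkSubmit would skip the automatic SparkContext shutdown for them.
   object ContextShutdown {

     /** Stop the context only for regular applications whose main() has finished. */
     def shouldStopAfterMain(isServer: Boolean): Boolean = !isServer

     // Inside SparkSubmit this would look roughly like:
     //   if (shouldStopAfterMain(args.isServer)) {
     //     SparkContext.getActive.foreach(_.stop())
     //   }
   }
   ```

   With this approach the mainClass check above would no longer be needed, since any server-style application could opt out via the flag.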


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org