Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/03/01 23:52:02 UTC

[GitHub] [spark] britishbadger commented on issue #24807: [SPARK-27958][SQL] Stopping a SparkSession should not always stop Spark Context

URL: https://github.com/apache/spark/pull/24807#issuecomment-593166681
 
 
   Did this get merged in? We are using newSession() to ensure that temporary views are isolated. We call a Spark application from a REST service, and newSession() works really nicely for our workloads, but when profiling the Java process I can see that the SparkSession objects are never released or garbage collected. I then found this PR, and I'd agree that not having a method to kill a session when you are done with it looks like a leak: stop() closes the underlying SparkContext, not the session.
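   For context, here is a minimal sketch of the pattern we're describing (the object name and the local[*] master are illustrative, not our actual service code). It shows what newSession() does give you, temp-view isolation on a shared SparkContext, and why there is nothing to call per session when a request finishes: stop(), and its close() alias, tear down the shared context for everyone.
   
   ```scala
   import org.apache.spark.sql.SparkSession
   
   object SessionIsolationSketch {
     def main(args: Array[String]): Unit = {
       // One shared context for the whole JVM (service process).
       val root = SparkSession.builder()
         .master("local[*]")
         .appName("session-isolation")
         .getOrCreate()
   
       // Each request gets its own session: it shares the SparkContext
       // but has its own SQL conf, UDF registry, and temporary views.
       val perRequest = root.newSession()
       perRequest.range(10).createOrReplaceTempView("rows")
       assert(perRequest.catalog.tableExists("rows"))
       assert(!root.catalog.tableExists("rows")) // isolation holds
   
       // The gap described above: perRequest.stop() (or close(), which
       // is an alias) would stop the shared SparkContext for every
       // session, so a long-lived service just drops the reference and
       // waits for GC -- and in our profiling these sessions linger.
       root.stop()
     }
   }
   ```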

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org