Posted to issues@spark.apache.org by "Alex Favaro (Jira)" <ji...@apache.org> on 2020/02/17 16:27:00 UTC

[jira] [Created] (SPARK-30856) SQLContext retains reference to unusable instance after SparkContext restarted

Alex Favaro created SPARK-30856:
-----------------------------------

             Summary: SQLContext retains reference to unusable instance after SparkContext restarted
                 Key: SPARK-30856
                 URL: https://issues.apache.org/jira/browse/SPARK-30856
             Project: Spark
          Issue Type: Bug
          Components: PySpark, SQL
    Affects Versions: 2.4.5
            Reporter: Alex Favaro


When the underlying SQLContext is instantiated for a SparkSession, the instance is saved as a class attribute and returned from subsequent calls to SQLContext.getOrCreate(). If the SparkContext is stopped and a new one started, that class attribute is never cleared, so any code that calls SQLContext.getOrCreate() gets back a SQLContext holding a reference to the old, unusable SparkContext.
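
To make this concrete, here is a minimal reproduction against PySpark 2.4 (the master and appName settings are just illustrative):

{code:python}
from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext(master="local[1]", appName="repro")
sqlContext = SQLContext.getOrCreate(sc)  # caches the instance on the class
sc.stop()

sc2 = SparkContext(master="local[1]", appName="repro")
sqlContext2 = SQLContext.getOrCreate(sc2)  # returns the stale cached instance

# Fails, because the cached SQLContext still holds a reference to the
# stopped SparkContext rather than sc2:
sqlContext2.createDataFrame([(1, "a")], ["id", "value"])
{code}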

A similar issue was identified and fixed for SparkSession in SPARK-19055, but that fix was not extended to SQLContext. I ran into this because mllib still [uses|https://github.com/apache/spark/blob/master/python/pyspark/mllib/common.py#L105] SQLContext.getOrCreate() under the hood.

I've already written a fix for this, which I'll share in a PR: it clears the class attribute on SQLContext when the SparkSession is stopped. Another option would be to deprecate SQLContext.getOrCreate() entirely, since the corresponding Scala [method|https://spark.apache.org/docs/latest/api/java/org/apache/spark/sql/SQLContext.html#getOrCreate-org.apache.spark.SparkContext-] is itself deprecated. That seems like a larger change for a relatively minor issue, however.
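
For reference, a minimal sketch of the approach (the {{_instantiatedContext}} and {{_instantiatedSession}} attribute names match the PySpark 2.4 source; treat this as illustrative rather than the exact patch):

{code:python}
# Sketch of SparkSession.stop() with the additional cleanup. The PySpark
# 2.4 implementation already resets _instantiatedSession here (the
# SPARK-19055 fix); this adds the analogous reset for SQLContext.
def stop(self):
    """Stop the underlying SparkContext and clear cached contexts."""
    # Imported lazily to avoid a circular import between session and context.
    from pyspark.sql.context import SQLContext
    self._sc.stop()
    # Clear the cached SQLContext so a later SQLContext.getOrCreate()
    # builds a fresh instance against the new SparkContext instead of
    # returning one bound to the stopped context.
    SQLContext._instantiatedContext = None
    SparkSession._instantiatedSession = None
{code}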


