Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2020/02/20 03:23:00 UTC
[jira] [Resolved] (SPARK-30856) SQLContext retains reference to unusable instance after SparkContext restarted
[ https://issues.apache.org/jira/browse/SPARK-30856?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-30856.
----------------------------------
Fix Version/s: 3.1.0
Resolution: Fixed
Issue resolved by pull request 27610
[https://github.com/apache/spark/pull/27610]
> SQLContext retains reference to unusable instance after SparkContext restarted
> ------------------------------------------------------------------------------
>
> Key: SPARK-30856
> URL: https://issues.apache.org/jira/browse/SPARK-30856
> Project: Spark
> Issue Type: Bug
> Components: PySpark, SQL
> Affects Versions: 2.4.5
> Reporter: Alex Favaro
> Priority: Major
> Fix For: 3.1.0
>
>
> When the underlying SQLContext is instantiated for a SparkSession, the instance is saved as a class attribute and returned from subsequent calls to SQLContext.getOrCreate(). If the SparkContext is stopped and a new one started, that class attribute is never cleared, so any code that calls SQLContext.getOrCreate() gets an SQLContext holding a reference to the old, unusable SparkContext.
> A similar issue was identified and fixed for SparkSession in SPARK-19055, but that fix did not extend to SQLContext. I ran into this because mllib still [uses|https://github.com/apache/spark/blob/master/python/pyspark/mllib/common.py#L105] SQLContext.getOrCreate() under the hood.
> I've already written a fix, which I'll share in a PR, that clears the class attribute on SQLContext when the SparkSession is stopped. Another option would be to deprecate SQLContext.getOrCreate() entirely, since the corresponding Scala [method|https://spark.apache.org/docs/latest/api/java/org/apache/spark/sql/SQLContext.html#getOrCreate-org.apache.spark.SparkContext-] is itself deprecated. That seems like a larger change for a relatively minor issue, however.
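The stale-singleton pattern described above can be sketched without Spark itself. The class names below (ToyContext, ToySQLContext) are hypothetical stand-ins for SparkContext and SQLContext, not Spark's actual implementation; the point is that a cached class attribute survives a stop/restart cycle unless something explicitly clears it, which is the shape of the fix.

```python
class ToyContext:
    """Stand-in for SparkContext: can be stopped and replaced."""
    def __init__(self):
        self.stopped = False

    def stop(self):
        self.stopped = True
        # The fix sketched here: clear the cached singleton when the
        # context is stopped, so getOrCreate() cannot hand back an
        # instance bound to a dead context.
        ToySQLContext._instantiated = None


class ToySQLContext:
    """Stand-in for SQLContext with the cached-singleton pattern."""
    _instantiated = None  # class attribute shared by all callers

    def __init__(self, ctx):
        self.ctx = ctx

    @classmethod
    def getOrCreate(cls, ctx):
        # Without the clearing step above, this would keep returning
        # the first instance even after its context was stopped.
        if cls._instantiated is None:
            cls._instantiated = cls(ctx)
        return cls._instantiated


old = ToyContext()
sql1 = ToySQLContext.getOrCreate(old)
old.stop()                       # clears the cached instance
new = ToyContext()
sql2 = ToySQLContext.getOrCreate(new)
assert sql2.ctx is new           # fresh context, not the stale one
assert not sql2.ctx.stopped
```

Without the clearing line in stop(), the final assertions fail: sql2 would still wrap the stopped context, which is exactly the behavior reported for SQLContext.getOrCreate().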
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org