Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/01/03 03:36:58 UTC

[jira] [Commented] (SPARK-19055) SparkSession initialization will be associated with invalid SparkContext when new SparkContext is created to replace stopped SparkContext

    [ https://issues.apache.org/jira/browse/SPARK-19055?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15794029#comment-15794029 ] 

Apache Spark commented on SPARK-19055:
--------------------------------------

User 'viirya' has created a pull request for this issue:
https://github.com/apache/spark/pull/16454

> SparkSession initialization will be associated with invalid SparkContext when new SparkContext is created to replace stopped SparkContext
> -----------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-19055
>                 URL: https://issues.apache.org/jira/browse/SPARK-19055
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, SQL
>            Reporter: Liang-Chi Hsieh
>
> In SparkSession initialization, we store the created SparkSession instance in a class variable, _instantiatedContext. The next time, SparkSession.builder.getOrCreate() can retrieve that existing SparkSession instance.
> However, when the active SparkContext is stopped and a new SparkContext is created to replace it, the existing SparkSession is still associated with the stopped SparkContext, so operations on that SparkSession will fail.
> We need to detect such a case in SparkSession and renew the class variable _instantiatedContext if needed.
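
For illustration, the reported sequence can be reproduced with a short PySpark snippet (a minimal sketch; the local master, app names, and the final action are illustrative and not from the ticket):

    from pyspark import SparkContext
    from pyspark.sql import SparkSession

    sc = SparkContext("local[2]", "first")
    spark = SparkSession.builder.getOrCreate()   # cached in SparkSession._instantiatedContext

    sc.stop()                                    # stop the active SparkContext
    sc2 = SparkContext("local[2]", "second")     # a new SparkContext replaces the stopped one

    spark2 = SparkSession.builder.getOrCreate()  # returns the cached SparkSession,
                                                 # still bound to the stopped sc
    spark2.range(1).count()                      # fails: the underlying SparkContext is dead

One plausible direction for a fix, consistent with the summary above (a sketch of the idea, not necessarily the change in the linked pull request): PySpark's SparkContext.stop() clears the context's internal _jsc handle, so getOrCreate() could treat a cached SparkSession whose context has been stopped as stale and create a fresh SparkSession against the new SparkContext.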


