Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2020/06/28 03:44:00 UTC

[jira] [Commented] (SPARK-30798) Scope Session.active in QueryExecution

    [ https://issues.apache.org/jira/browse/SPARK-30798?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17147189#comment-17147189 ] 

Apache Spark commented on SPARK-30798:
--------------------------------------

User 'xuanyuanking' has created a pull request for this issue:
https://github.com/apache/spark/pull/28936

> Scope Session.active in QueryExecution
> --------------------------------------
>
>                 Key: SPARK-30798
>                 URL: https://issues.apache.org/jira/browse/SPARK-30798
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Ali Afroozeh
>            Assignee: Ali Afroozeh
>            Priority: Major
>             Fix For: 3.0.0
>
>
> SparkSession.active is a thread-local variable that points to the current thread's Spark session. Note that SQLConf.get depends on SparkSession.active. In the current implementation, SparkSession.active can point to a different session than the one being used, which causes various problems. Most of these problems arise because part of the query processing is done using the configurations of a different session. For example, when creating a DataFrame with a new session, i.e., session.sql("..."), part of the DataFrame is constructed using the currently active Spark session, which can be a different session from the one used later to process the query.
> This PR scopes SparkSession.active to prevent the problems described above. A new method, withActive, is introduced on SparkSession: it sets the session as the active one for the duration of a block of code and restores the previously active session once the block has executed.
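
For illustration, here is a minimal, self-contained sketch of the save-and-restore pattern the description refers to. The Session case class and its companion below are stand-ins invented for this example, not Spark's actual implementation; only the withActive name and the thread-local save/restore behavior come from the issue description, and the real change is in the pull request linked above.

    // Illustrative stand-in for SparkSession; not Spark's actual code.
    object SessionDemo {
      final case class Session(name: String)

      object Session {
        // Thread-local holder for the currently active session.
        private val active = new ThreadLocal[Session]

        def activeSession: Option[Session] = Option(active.get())

        // Run `block` with `session` active, then restore whatever
        // session (possibly none) was active before, even on failure.
        def withActive[T](session: Session)(block: => T): T = {
          val previous = active.get() // null if no session was active
          active.set(session)
          try block
          finally active.set(previous)
        }
      }

      def main(args: Array[String]): Unit = {
        val a = Session("a")
        val b = Session("b")
        Session.withActive(a) {
          println(Session.activeSession)   // Some(Session(a))
          Session.withActive(b) {
            println(Session.activeSession) // Some(Session(b))
          }
          println(Session.activeSession)   // Some(Session(a)), restored
        }
      }
    }

Without such scoping, any code path that sets the active session and fails to restore it leaves SQLConf.get reading the wrong session's configuration for the rest of that thread's work.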



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org