Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/02/18 20:31:17 UTC

[GitHub] [spark] zsxwing commented on a change in pull request #35410: [SPARK-38121][PYTHON][SQL] Use SparkSession instead of SQLContext inside PySpark

zsxwing commented on a change in pull request #35410:
URL: https://github.com/apache/spark/pull/35410#discussion_r810321103



##########
File path: python/pyspark/sql/dataframe.py
##########
@@ -116,13 +140,45 @@ def __init__(self, jdf: JavaObject, sql_ctx: "SQLContext"):
         # by __repr__ and _repr_html_ while eager evaluation opened.
         self._support_repr_html = False
 
+    @property
+    def sql_ctx(self) -> "SQLContext":
+        from pyspark.sql.context import SQLContext
+
+        warnings.warn(
+            "DataFrame.sql_ctx is an internal property, and will be removed "
+            "in future releases. Use DataFrame.sparkSession instead."
+        )
+        if self._sql_ctx is None:
+            self._sql_ctx = SQLContext._get_or_create(self._sc)
+        return self._sql_ctx
+
+    @property  # type: ignore[misc]
+    def sparkSession(self) -> "SparkSession":
+        """Returns Spark session that created this :class:`DataFrame`.
+
+        .. versionadded:: 3.3.0
+
+        Examples
+        --------
+        >>> df = spark.range(1)
+        >>> type(df.sparkSession)
+        <class 'pyspark.sql.session.SparkSession'>
+        """
+        from pyspark.sql.session import SparkSession
+
+        if self._session is None:
+            self._session = SparkSession._getActiveSessionOrCreate()
+        return self._session

Review comment:
       This doesn't look correct. The current DataFrame's `sparkSession` may not be the active session. Why not get the session from `self._sql_ctx` and assign it to `self._session` in the constructor?
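
A minimal sketch of what this suggestion might look like (a stripped-down, hypothetical class, not the actual patch), assuming the constructor receives the `SQLContext` that created the DataFrame and that `SQLContext` exposes its owning session via its `sparkSession` attribute:

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from py4j.java_gateway import JavaObject
    from pyspark.sql.context import SQLContext
    from pyspark.sql.session import SparkSession


class DataFrame:
    def __init__(self, jdf: "JavaObject", sql_ctx: "SQLContext"):
        self._jdf = jdf
        self._sql_ctx = sql_ctx
        # Remember the session that created this DataFrame up front, rather
        # than lazily resolving whichever session happens to be active when
        # the sparkSession property is first read.
        self._session: "SparkSession" = sql_ctx.sparkSession

    @property
    def sparkSession(self) -> "SparkSession":
        """Returns the Spark session that created this :class:`DataFrame`."""
        return self._session
```

With the session pinned in the constructor, `df.sparkSession` stays stable even if a different session is made active later.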




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
