Posted to reviews@spark.apache.org by "allisonwang-db (via GitHub)" <gi...@apache.org> on 2023/08/07 18:49:31 UTC

[GitHub] [spark] allisonwang-db commented on a diff in pull request #42371: [SPARK-44694][PYTHON][CONNECT] Refactor active sessions and expose them as an API

allisonwang-db commented on code in PR #42371:
URL: https://github.com/apache/spark/pull/42371#discussion_r1286270191


##########
python/pyspark/sql/connect/session.py:
##########
@@ -236,6 +237,38 @@ def __init__(self, connection: Union[str, ChannelBuilder], userId: Optional[str]
         self._client = SparkConnectClient(connection=connection, user_id=userId)
         self._session_id = self._client._session_id
 
+    @classmethod
+    def _set_default_and_active_session(cls, session: "SparkSession") -> None:
+        """
+        Set the (global) default :class:`SparkSession`, and (thread-local)
+        active :class:`SparkSession` when they are not set yet.
+        """
+        with cls._lock:
+            if cls._default_session is None:
+                cls._default_session = session
+        if getattr(cls._active_session, "session", None) is None:
+            cls._active_session.session = session
+
+    @classmethod
+    def getActiveSession(cls) -> Optional["SparkSession"]:

Review Comment:
   Is this also a public API? I found it a bit confusing that `.getActiveSession()` returns only the (thread-local) active session, while `.active()` returns the active or the default session (and throws an exception if neither is set). Is this API the same as the non-Spark-Connect one?
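
To make the contrast the comment describes concrete: continuing the sketch above, a hypothetical `active()` matching the reviewer's description could look like the following. The `SessionWithActive` name and the method body are illustrative only, written from the behavior described in the comment, not taken from the PR:

    class SessionWithActive(Session):
        @classmethod
        def active(cls) -> "Session":
            # Prefer this thread's active session, fall back to the
            # global default, and raise when neither is set.
            # Hypothetical; not the PR's implementation.
            session = cls.getActiveSession()
            if session is None:
                with cls._lock:
                    session = cls._default_session
            if session is None:
                raise RuntimeError("no active or default session was found")
            return session

Under this reading, `getActiveSession()` can return `None` while `active()` never does (it raises instead); that asymmetry is what the comment asks to clarify.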



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org