Posted to commits@spark.apache.org by gu...@apache.org on 2023/02/03 00:25:09 UTC
[spark] branch branch-3.4 updated: [SPARK-42295][CONNECT][TEST] Tear down the test cleanly
This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.4 by this push:
new 17ab68e465c [SPARK-42295][CONNECT][TEST] Tear down the test cleanly
17ab68e465c is described below
commit 17ab68e465cfb9ccffaf3ec227a0353cee32545f
Author: Takuya UESHIN <ue...@databricks.com>
AuthorDate: Fri Feb 3 09:24:46 2023 +0900
[SPARK-42295][CONNECT][TEST] Tear down the test cleanly
### What changes were proposed in this pull request?
Tears down the test cleanly.
### Why are the changes needed?
Currently `SparkConnectSQLTestCase` doesn't tear down cleanly; its `tearDownClass` calls `setUpClass` instead.
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Manually.
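The fix wraps the per-suite cleanup in a `try`/`finally` so the parent class's `tearDownClass` always runs, even if the cleanup raises. A minimal sketch of that pattern, using an illustrative stand-in session class and hypothetical helper names (not the actual Spark Connect test helpers):

```python
import os
import unittest


class _Session:
    """Hypothetical stand-in for a Spark session; only stop() matters here."""

    def __init__(self):
        self.stopped = False

    def stop(self):
        self.stopped = True


class ExampleTestCase(unittest.TestCase):
    """Sketch of the try/finally teardown pattern used in the fix."""

    @classmethod
    def setUpClass(cls):
        # Analogous to setting PYSPARK_NO_NAMESPACE_SHARE in the real suite.
        os.environ["EXAMPLE_FLAG"] = "1"
        cls.session = _Session()

    @classmethod
    def clean_up_test_data(cls):
        pass  # placeholder for per-suite cleanup

    @classmethod
    def tearDownClass(cls):
        try:
            cls.clean_up_test_data()
            del os.environ["EXAMPLE_FLAG"]
        finally:
            # Runs even if the cleanup above raises, so the session
            # (and, in the real suite, the JVM-side resources) is
            # always released.
            cls.session.stop()
```

Without the `finally`, an exception in `clean_up_test_data` would skip the parent teardown and leak the session across test classes.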
Closes #39864 from ueshin/issues/SPARK-42295/tearDownClass.
Authored-by: Takuya UESHIN <ue...@databricks.com>
Signed-off-by: Hyukjin Kwon <gu...@apache.org>
(cherry picked from commit d740b437e17ce4b1bf38f577f7ef3d067bbd9e91)
Signed-off-by: Hyukjin Kwon <gu...@apache.org>
---
python/pyspark/sql/tests/connect/test_connect_basic.py | 13 ++++++++-----
1 file changed, 8 insertions(+), 5 deletions(-)
diff --git a/python/pyspark/sql/tests/connect/test_connect_basic.py b/python/pyspark/sql/tests/connect/test_connect_basic.py
index 5d7d2cdc8fa..08fad856036 100644
--- a/python/pyspark/sql/tests/connect/test_connect_basic.py
+++ b/python/pyspark/sql/tests/connect/test_connect_basic.py
@@ -81,7 +81,7 @@ class SparkConnectSQLTestCase(ReusedConnectTestCase, SQLTestUtils, PandasOnSpark
# PySpark libraries.
os.environ["PYSPARK_NO_NAMESPACE_SHARE"] = "1"
- cls.connect = cls.spark # Switch Spark Connect session and regular PySpark sesion.
+ cls.connect = cls.spark # Switch Spark Connect session and regular PySpark session.
cls.spark = PySparkSession._instantiatedSession
assert cls.spark is not None
@@ -103,10 +103,13 @@ class SparkConnectSQLTestCase(ReusedConnectTestCase, SQLTestUtils, PandasOnSpark
@classmethod
def tearDownClass(cls):
- cls.spark_connect_clean_up_test_data()
- cls.spark = cls.connect # Stopping Spark Connect closes the session in JVM at the server.
- super(SparkConnectSQLTestCase, cls).setUpClass()
- del os.environ["PYSPARK_NO_NAMESPACE_SHARE"]
+ try:
+ cls.spark_connect_clean_up_test_data()
+ # Stopping Spark Connect closes the session in JVM at the server.
+ cls.spark = cls.connect
+ del os.environ["PYSPARK_NO_NAMESPACE_SHARE"]
+ finally:
+ super(SparkConnectSQLTestCase, cls).tearDownClass()
@classmethod
def spark_connect_load_test_data(cls):
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org