Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/08/16 03:06:03 UTC

[GitHub] [spark] LuciferYang commented on a diff in pull request #37487: [SPARK-40053][CORE][SQL][TESTS] Add `assume` to dynamic cancel cases which requiring Python runtime environment

LuciferYang commented on code in PR #37487:
URL: https://github.com/apache/spark/pull/37487#discussion_r946296623


##########
sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveExternalCatalogVersionsSuite.scala:
##########
@@ -201,7 +201,11 @@ class HiveExternalCatalogVersionsSuite extends SparkSubmitTestUtils {
     // scalastyle:on line.size.limit
 
     if (PROCESS_TABLES.testingVersions.isEmpty) {
-      logError("Fail to get the latest Spark versions to test.")
+      if (PROCESS_TABLES.isPythonVersionAtLeast37) {
+        logError("Fail to get the latest Spark versions to test.")
+      } else {
+        logError("Python version <  3.7.0, the running environment is unavailable.")

Review Comment:
   Yes, you understood correctly; this is exactly what I want to discuss:
   
   Should we only check the availability of python3, regardless of its version, or should we specify a minimum required version?
   
   I set the condition under which `HiveExternalCatalogVersionsSuite` is allowed to run to `isPythonVersionAtLeast37` in this PR because the documentation states `[Spark runs on Java 8/11/17, Scala 2.12/2.13, Python 3.7+ and R 3.5+.](https://github.com/apache/spark/blob/master/docs/index.md)`, so I also made `Python version < 3.7.0` explicit in this message. Under this condition, if users run the tests with `Python 3.0 ~ Python 3.6`, they will clearly know that the test was ignored because the Python version is too low, not because Python 3 is unavailable.
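   
   As a minimal sketch (the object name `PythonVersionCheck` and the probing strategy below are assumptions for illustration, not necessarily what this PR implements), such a check could shell out to `python3 --version` and compare the result against the documented `Python 3.7+` requirement:
   
   ```scala
   import scala.sys.process._
   import scala.util.Try
   
   object PythonVersionCheck {
     // Returns Some((major, minor)) when `python3 --version` succeeds, None otherwise.
     // Assumes the interpreter prints something like "Python 3.9.7" to stdout.
     private def pythonVersion: Option[(Int, Int)] = Try {
       val out = Seq("python3", "--version").!!.trim
       val Array(major, minor, _*) = out.stripPrefix("Python ").split('.')
       (major.toInt, minor.toInt)
     }.toOption
   
     // True only when the available interpreter satisfies "Python 3.7+".
     def isPythonVersionAtLeast37: Boolean = pythonVersion.exists {
       case (major, minor) => major > 3 || (major == 3 && minor >= 7)
     }
   }
   ```
   
   A suite could then pass this to ScalaTest's `assume`, e.g. `assume(PythonVersionCheck.isPythonVersionAtLeast37, "Python 3.7+ is required")`, so that the case is cancelled rather than failed when the requirement is not met.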



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

