Posted to commits@spark.apache.org by gu...@apache.org on 2022/08/10 01:00:45 UTC

[spark] branch branch-3.2 updated: [SPARK-40022][YARN][TESTS] Ignore pyspark suites in `YarnClusterSuite` when python3 is unavailable

This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch branch-3.2
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.2 by this push:
     new a1fd1a2da56 [SPARK-40022][YARN][TESTS] Ignore pyspark suites in `YarnClusterSuite` when python3 is unavailable
a1fd1a2da56 is described below

commit a1fd1a2da56f521563992d7490e8dc1e6cac5d18
Author: yangjie01 <ya...@baidu.com>
AuthorDate: Wed Aug 10 09:59:35 2022 +0900

    [SPARK-40022][YARN][TESTS] Ignore pyspark suites in `YarnClusterSuite` when python3 is unavailable
    
    ### What changes were proposed in this pull request?
    This PR adds an `assume(isPythonAvailable)` check to the `testPySpark` method in `YarnClusterSuite` so that the suite passes in an environment where Python 3 is not configured.
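    
    As a brief aside (not part of the patch itself): ScalaTest's `assume` throws `TestCanceledException` when its condition is false, so only the guarded test is reported as canceled, whereas the previous `assert` ran inside a field initializer and therefore failed while the suite was being constructed, aborting the whole suite. A minimal standalone sketch of this guard pattern is below; the suite name and the `ProcessBuilder`-based probe are hypothetical stand-ins for Spark's `TestUtils.getAbsolutePathFromExecutable`:
    
    ```
    import org.scalatest.funsuite.AnyFunSuite
    
    // Hypothetical illustration of the assume-based guard used in this patch.
    class PythonGuardSketch extends AnyFunSuite {
    
      // Probe for python3 at construction time; any failure simply yields false
      // instead of throwing and aborting suite instantiation.
      private val isPythonAvailable: Boolean =
        scala.util.Try(new ProcessBuilder("python3", "--version").start().waitFor() == 0)
          .getOrElse(false)
    
      test("python-dependent test") {
        // Cancels this single test (rather than aborting the suite) when python3 is missing.
        assume(isPythonAvailable)
        // ... test body that shells out to python3 ...
      }
    }
    ```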
    
    ### Why are the changes needed?
    The whole `YarnClusterSuite` should not be `ABORTED` when `python3` is not configured; only the pyspark tests should be skipped.
    
    ### Does this PR introduce _any_ user-facing change?
    No.
    
    ### How was this patch tested?
    
    - Pass GitHub Actions
    - Manual test
    
    Run
    
    ```
    mvn clean test -pl resource-managers/yarn -am -Pyarn -DwildcardSuites=org.apache.spark.deploy.yarn.YarnClusterSuite  -Dtest=none
    ```
    in an environment without Python 3 configured:
    
    **Before**
    
    ```
    YarnClusterSuite:
    org.apache.spark.deploy.yarn.YarnClusterSuite *** ABORTED ***
      java.lang.RuntimeException: Unable to load a Suite class that was discovered in the runpath: org.apache.spark.deploy.yarn.YarnClusterSuite
      at org.scalatest.tools.DiscoverySuite$.getSuiteInstance(DiscoverySuite.scala:81)
      at org.scalatest.tools.DiscoverySuite.$anonfun$nestedSuites$1(DiscoverySuite.scala:38)
      at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
      at scala.collection.Iterator.foreach(Iterator.scala:943)
      at scala.collection.Iterator.foreach$(Iterator.scala:943)
      at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
      at scala.collection.IterableLike.foreach(IterableLike.scala:74)
      at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
      at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
      at scala.collection.TraversableLike.map(TraversableLike.scala:286)
      ...
    Run completed in 833 milliseconds.
    Total number of tests run: 0
    Suites: completed 1, aborted 1
    Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
    *** 1 SUITE ABORTED ***
    ```
    
    **After**
    
    ```
    YarnClusterSuite:
    - run Spark in yarn-client mode
    - run Spark in yarn-cluster mode
    - run Spark in yarn-client mode with unmanaged am
    - run Spark in yarn-client mode with different configurations, ensuring redaction
    - run Spark in yarn-cluster mode with different configurations, ensuring redaction
    - yarn-cluster should respect conf overrides in SparkHadoopUtil (SPARK-16414, SPARK-23630)
    - SPARK-35672: run Spark in yarn-client mode with additional jar using URI scheme 'local'
    - SPARK-35672: run Spark in yarn-cluster mode with additional jar using URI scheme 'local'
    - SPARK-35672: run Spark in yarn-client mode with additional jar using URI scheme 'local' and gateway-replacement path
    - SPARK-35672: run Spark in yarn-cluster mode with additional jar using URI scheme 'local' and gateway-replacement path
    - SPARK-35672: run Spark in yarn-cluster mode with additional jar using URI scheme 'local' and gateway-replacement path containing an environment variable
    - SPARK-35672: run Spark in yarn-client mode with additional jar using URI scheme 'file'
    - SPARK-35672: run Spark in yarn-cluster mode with additional jar using URI scheme 'file'
    - run Spark in yarn-cluster mode unsuccessfully
    - run Spark in yarn-cluster mode failure after sc initialized
    - run Python application in yarn-client mode !!! CANCELED !!!
      YarnClusterSuite.this.isPythonAvailable was false (YarnClusterSuite.scala:376)
    - run Python application in yarn-cluster mode !!! CANCELED !!!
      YarnClusterSuite.this.isPythonAvailable was false (YarnClusterSuite.scala:376)
    - run Python application in yarn-cluster mode using spark.yarn.appMasterEnv to override local envvar !!! CANCELED !!!
      YarnClusterSuite.this.isPythonAvailable was false (YarnClusterSuite.scala:376)
    - user class path first in client mode
    - user class path first in cluster mode
    - monitor app using launcher library
    - running Spark in yarn-cluster mode displays driver log links
    - timeout to get SparkContext in cluster mode triggers failure
    - executor env overwrite AM env in client mode
    - executor env overwrite AM env in cluster mode
    - SPARK-34472: ivySettings file with no scheme or file:// scheme should be localized on driver in cluster mode
    - SPARK-34472: ivySettings file with no scheme or file:// scheme should retain user provided path in client mode
    - SPARK-34472: ivySettings file with non-file:// schemes should throw an error
    Run completed in 7 minutes, 2 seconds.
    Total number of tests run: 25
    Suites: completed 2, aborted 0
    Tests: succeeded 25, failed 0, canceled 3, ignored 0, pending 0
    All tests passed.
    ```
    
    Closes #37454 from LuciferYang/yarnclustersuite.
    
    Authored-by: yangjie01 <ya...@baidu.com>
    Signed-off-by: Hyukjin Kwon <gu...@apache.org>
    (cherry picked from commit 8e472443081342a0e0dc37aa154e30a0a6df39b7)
    Signed-off-by: Hyukjin Kwon <gu...@apache.org>
---
 .../scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala  | 10 ++++++----
 1 file changed, 6 insertions(+), 4 deletions(-)

diff --git a/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala b/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala
index 26ff3bf2971..8a0d98bc286 100644
--- a/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala
+++ b/resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnClusterSuite.scala
@@ -53,11 +53,12 @@ import org.apache.spark.util.{Utils, YarnContainerInfoHelper}
 @ExtendedYarnTest
 class YarnClusterSuite extends BaseYarnClusterSuite {
 
-  private val pythonExecutablePath = {
+  private val (isPythonAvailable, pythonExecutablePath) = {
     // To make sure to use the same Python executable.
-    val maybePath = TestUtils.getAbsolutePathFromExecutable("python3")
-    assert(maybePath.isDefined)
-    maybePath.get
+    TestUtils.getAbsolutePathFromExecutable("python3") match {
+      case Some(path) => (true, path)
+      case _ => (false, "")
+    }
   }
 
   override def newYarnConfig(): YarnConfiguration = new YarnConfiguration()
@@ -302,6 +303,7 @@ class YarnClusterSuite extends BaseYarnClusterSuite {
       clientMode: Boolean,
       extraConf: Map[String, String] = Map(),
       extraEnv: Map[String, String] = Map()): Unit = {
+    assume(isPythonAvailable)
     val primaryPyFile = new File(tempDir, "test.py")
     Files.write(TEST_PYFILE, primaryPyFile, StandardCharsets.UTF_8)
 


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org