Posted to commits@spark.apache.org by do...@apache.org on 2022/01/08 17:30:14 UTC

[spark] branch master updated: [SPARK-37844][CORE][TESTS] Remove `slf4j-log4j12` transitive test dependency from `hadoop-minikdc`

This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new ef5d1f9  [SPARK-37844][CORE][TESTS] Remove `slf4j-log4j12` transitive test dependency from `hadoop-minikdc`
ef5d1f9 is described below

commit ef5d1f98c771256ef57120f125d8cbc9c5be2e62
Author: Dongjoon Hyun <do...@apache.org>
AuthorDate: Sat Jan 8 09:28:24 2022 -0800

    [SPARK-37844][CORE][TESTS] Remove `slf4j-log4j12` transitive test dependency from `hadoop-minikdc`
    
    ### What changes were proposed in this pull request?
    
    This PR excludes the `slf4j-log4j12` transitive test dependency from `hadoop-minikdc`.
    
    ### Why are the changes needed?
    
    The transitive `slf4j-log4j12` dependency puts a second SLF4J binding on the test class path, which causes a `Maven` test failure.
    
    **BEFORE**
    ```
    $ build/mvn -Dtest=none -DwildcardSuites=org.apache.spark.deploy.SparkSubmitSuite test
    ...
    00:09:19.585 - launch simple application with spark-submit *** FAILED ***
    00:10:17.712   Timeout of '/Users/m1/.jenkins/workspace/master-sbt/bin/spark-submit' '--class' 'org.apache.spark.deploy.SimpleApplicationTest' '--name' 'testApp' '--master' 'local' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' 'file:/Users/m1/.jenkins/workspace/master-sbt/core/target/tmp/spark-9bc87b17-c585-4661-adf4-f51a3c01586f/testJar-1641631487327.jar' See the log4j logs for more detail.
    00:10:17.712   2022-01-08 00:44:47.955 - stderr> SLF4J: Class path contains multiple SLF4J bindings.
    00:10:17.712   2022-01-08 00:44:47.955 - stderr> SLF4J: Found binding in [jar:file:/Users/m1/.m2/repository/org/slf4j/slf4j-log4j12/1.7.30/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    00:10:17.712   2022-01-08 00:44:47.956 - stderr> SLF4J: Found binding in [jar:file:/Users/m1/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.17.1/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    00:10:17.712   2022-01-08 00:44:47.956 - stderr> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    00:10:17.712   2022-01-08 00:44:47.956 - stderr> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
    00:10:17.712   2022-01-08 00:44:49.246 - stderr> Exception in thread "Executor task launch worker for task 0.0 in stage 0.0 (TID 0)" java.lang.NoClassDefFoundError: Could not initialize class org.slf4j.MDC
    00:10:17.712   2022-01-08 00:44:49.246 - stderr> 	at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$setMDCForTask(Executor.scala:751)
    00:10:17.712   2022-01-08 00:44:49.246 - stderr> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:441)
    00:10:17.712   2022-01-08 00:44:49.246 - stderr> 	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
    00:10:17.712   2022-01-08 00:44:49.246 - stderr> 	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
    00:10:17.712   2022-01-08 00:44:49.246 - stderr> 	at java.base/java.lang.Thread.run(Thread.java:833) (SparkSubmitTestUtils.scala:107)
    ```
    
    ```
    [INFO] +- org.apache.hadoop:hadoop-minikdc:jar:3.3.1:test
    [INFO] |  +- org.apache.kerby:kerb-simplekdc:jar:1.0.1:test
    ...
    [INFO] |  |     \- org.apache.kerby:kerby-xdr:jar:1.0.1:test
    [INFO] |  \- org.slf4j:slf4j-log4j12:jar:1.7.30:test
    ```
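    The transitive edge shown in the tree above can be located with the Maven dependency plugin's `dependency:tree` goal, filtered to the offending artifact (a sketch; run from the Spark repository root, where `build/mvn` is the bundled Maven wrapper):
    
    ```shell
    # List every path through which slf4j-log4j12 enters the classpath.
    # -Dincludes filters the printed tree to the given groupId:artifactId.
    build/mvn dependency:tree -Dincludes=org.slf4j:slf4j-log4j12
    ```
    
    After the exclusion is applied, the same command should report no matching artifacts under `hadoop-minikdc`.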
    
    **AFTER**
    ```
    $ build/mvn -Dtest=none -DwildcardSuites=org.apache.spark.deploy.SparkSubmitSuite test
    ...
    SparkSubmitSuite:
    - prints usage on empty input
    - prints usage with only --help
    - prints error with unrecognized options
    - handle binary specified but not class
    - handles arguments with --key=val
    - handles arguments to user program
    - handles arguments to user program with name collision
    - print the right queue name
    - SPARK-24241: do not fail fast if executor num is 0 when dynamic allocation is enabled
    - specify deploy mode through configuration
    - handles YARN cluster mode
    - handles YARN client mode
    - SPARK-33530: handles standalone mode with archives
    - handles standalone cluster mode
    - handles legacy standalone cluster mode
    - handles standalone client mode
    - handles mesos client mode
    - handles k8s cluster mode
    - automatically sets mainClass if primary resource is S3 JAR in client mode
    - automatically sets mainClass if primary resource is S3 JAR in cluster mode
    - error informatively when mainClass isn't set and S3 JAR doesn't exist
    - handles confs with flag equivalents
    - SPARK-21568 ConsoleProgressBar should be enabled only in shells
    - launch simple application with spark-submit
    - launch simple application with spark-submit with redaction
    - includes jars passed in through --jars
    - includes jars passed through spark.jars.packages and spark.jars.repositories
    - correctly builds R packages included in a jar with --packages !!! IGNORED !!!
    - include an external JAR in SparkR !!! CANCELED !!!
      org.apache.spark.api.r.RUtils.isSparkRInstalled was false SparkR is not installed in this build. (SparkSubmitSuite.scala:740)
    - resolves command line argument paths correctly
    - ambiguous archive mapping results in error message
    - resolves config paths correctly
    - user classpath first in driver
    - SPARK_CONF_DIR overrides spark-defaults.conf
    - support glob path
    - SPARK-27575: yarn confs should merge new value with existing value
    - downloadFile - invalid url
    - downloadFile - file doesn't exist
    - downloadFile does not download local file
    - download one file to local
    - download list of files to local
    - remove copies of application jar from classpath
    - Avoid re-upload remote resources in yarn client mode
    - download remote resource if it is not supported by yarn service
    - avoid downloading remote resource if it is supported by yarn service
    - force download from forced schemes
    - force download for all the schemes
    - SPARK-32119: Jars and files should be loaded when Executors launch for plugins
    - start SparkApplication without modifying system properties
    - support --py-files/spark.submit.pyFiles in non pyspark application
    - handles natural line delimiters in --properties-file and --conf uniformly
    - get a Spark configuration from arguments
    Run completed in 25 seconds, 216 milliseconds.
    Total number of tests run: 51
    Suites: completed 2, aborted 0
    Tests: succeeded 51, failed 0, canceled 1, ignored 1, pending 0
    All tests passed.
    ```
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Pass the tests with Maven.
    
    Closes #35143 from dongjoon-hyun/SPARK-37844.
    
    Authored-by: Dongjoon Hyun <do...@apache.org>
    Signed-off-by: Dongjoon Hyun <do...@apache.org>
---
 pom.xml | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/pom.xml b/pom.xml
index fdda77c..323d9cc 100644
--- a/pom.xml
+++ b/pom.xml
@@ -1365,6 +1365,10 @@
             <groupId>log4j</groupId>
             <artifactId>log4j</artifactId>
           </exclusion>
+          <exclusion>
+            <groupId>org.slf4j</groupId>
+            <artifactId>slf4j-log4j12</artifactId>
+          </exclusion>
         </exclusions>
       </dependency>
       <dependency>
