Posted to commits@spark.apache.org by sr...@apache.org on 2019/03/20 22:56:20 UTC

[spark] branch master updated: [SPARK-27205][CORE] Remove complicated logic for just leaving warning log when main class is scala.App

This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new a8d9531  [SPARK-27205][CORE] Remove complicated logic for just leaving warning log when main class is scala.App
a8d9531 is described below

commit a8d9531edc80116d4549efd62af116a90256864b
Author: Jungtaek Lim (HeartSaVioR) <ka...@gmail.com>
AuthorDate: Wed Mar 20 17:55:48 2019 -0500

    [SPARK-27205][CORE] Remove complicated logic for just leaving warning log when main class is scala.App
    
    ## What changes were proposed in this pull request?
    
    [SPARK-26977](https://issues.apache.org/jira/browse/SPARK-26977) introduced a strange regression: spark-shell is no longer able to load classes provided via `--packages`. I don't know the exact root cause, but initializing the `object class` (the `$`-suffixed singleton class) appears to trigger the problem, possibly because its static initialization ends up running twice.
    
    This patch removes the logic that logs a warning when the main class is a subclass of scala.App, since that complexity is not worth keeping just to emit a warning message.
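    
    To illustrate the pitfall the removed warning covered (a minimal standalone sketch, not code from this patch; the object names are made up): an object extending scala.App has its whole body run inside the inherited main(), so its fields are still uninitialized if anything reads them before main() executes (the SPARK-4170 class of problems).
    
    ```scala
    // Hypothetical sketch: field initialization timing for scala.App vs. an explicit main().
    object AppStyleJob extends App {
      // Assigned only when the inherited main() runs the object body.
      val appName: String = "app-style-job"
    }
    
    object MainStyleJob {
      def main(args: Array[String]): Unit = {
        // Assigned eagerly as soon as main() starts.
        val appName: String = "main-style-job"
        println(appName)
      }
    }
    
    object Demo {
      def main(args: Array[String]): Unit = {
        println(AppStyleJob.appName)           // null: AppStyleJob.main() has not run yet
        MainStyleJob.main(Array.empty[String]) // prints "main-style-job"
      }
    }
    ```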
    
    ## How was this patch tested?
    
    Manual test: suppose we run spark-shell with `--packages` option like below:
    
    ```
    ./bin/spark-shell --verbose   --master "local[*]" --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.0
    ```
    
    Before this patch, importing a class from a transitive dependency fails:
    
    ```
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
    Spark context Web UI available at http://localhost:4040
    Spark context available as 'sc' (master = local[*], app id = local-1553005771597).
    Spark session available as 'spark'.
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 3.0.0-SNAPSHOT
          /_/
    
    Using Scala version 2.12.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_191)
    Type in expressions to have them evaluated.
    Type :help for more information.
    
    scala> import org.apache.kafka
    <console>:23: error: object kafka is not a member of package org.apache
           import org.apache.kafka
    ```
    
    After this patch, importing a class from a transitive dependency succeeds:
    
    ```
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
    Spark context Web UI available at http://localhost:4040
    Spark context available as 'sc' (master = local[*], app id = local-1553004095542).
    Spark session available as 'spark'.
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /___/ .__/\_,_/_/ /_/\_\   version 3.0.0-SNAPSHOT
          /_/
    
    Using Scala version 2.12.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_191)
    Type in expressions to have them evaluated.
    Type :help for more information.
    
    scala> import org.apache.kafka
    import org.apache.kafka
    ```
    
    Closes #24147 from HeartSaVioR/SPARK-27205.
    
    Authored-by: Jungtaek Lim (HeartSaVioR) <ka...@gmail.com>
    Signed-off-by: Sean Owen <se...@databricks.com>
---
 .../src/main/scala/org/apache/spark/deploy/SparkSubmit.scala | 12 ------------
 1 file changed, 12 deletions(-)

diff --git a/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
index 493cad0..b6673e4 100644
--- a/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
@@ -826,18 +826,6 @@ private[spark] class SparkSubmit extends Logging {
     val app: SparkApplication = if (classOf[SparkApplication].isAssignableFrom(mainClass)) {
       mainClass.getConstructor().newInstance().asInstanceOf[SparkApplication]
     } else {
-      // Scala object subclassing scala.App has its whole class body executed in the
-      // main method it inherits. Fields of the object will not have been initialized
-      // before the main method has been executed, which will cause problems like SPARK-4170
-      // Note two Java classes are generated, the childMainClass and childMainClass$.
-      // Users will pass in childMainClass which will delegate all invocations to childMainClass$
-      // but it's childMainClass$ that subclasses scala.App and we should check for.
-      Try {
-        if (classOf[scala.App].isAssignableFrom(Utils.classForName(s"$childMainClass$$"))) {
-          logWarning("Subclasses of scala.App may not work correctly. " +
-            "Use a main() method instead.")
-        }
-      }
       new JavaMainApplication(mainClass)
     }
 

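The comment removed in the hunk above explains that scalac emits two JVM classes for a Scala object. A minimal standalone sketch (class names are hypothetical, not from Spark) of why the check had to load the "$"-suffixed class before testing for scala.App:

```scala
// Hypothetical sketch: for `object MyJob extends App`, scalac emits two classes:
//   MyJob  - static forwarders (this is the name users pass as the main class)
//   MyJob$ - the singleton class, which is the one that actually extends scala.App
object MyJob extends App {
  println("object body runs inside the inherited main()")
}

object CheckScalaApp {
  def main(args: Array[String]): Unit = {
    // Mirrors the removed check: load "<mainClass>$" and test it against scala.App.
    val singleton = Class.forName("MyJob$")
    println(classOf[scala.App].isAssignableFrom(singleton)) // prints: true
  }
}
```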

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org