Posted to issues@spark.apache.org by "Yang Jie (Jira)" <ji...@apache.org> on 2022/08/16 07:40:00 UTC
[jira] [Updated] (SPARK-40101) `include an external JAR in SparkR` is in the core module but needs antlr4
[ https://issues.apache.org/jira/browse/SPARK-40101?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yang Jie updated SPARK-40101:
-----------------------------
Summary: `include an external JAR in SparkR` is in the core module but needs antlr4 (was: `include an external JAR in SparkR` in core module but need antlr)
> `include an external JAR in SparkR` is in the core module but needs antlr4
> --------------------------------------------------------------------------
>
> Key: SPARK-40101
> URL: https://issues.apache.org/jira/browse/SPARK-40101
> Project: Spark
> Issue Type: Bug
> Components: Spark Core, Tests
> Affects Versions: 3.4.0
> Reporter: Yang Jie
> Priority: Major
>
> Run the following commands:
>
> {code:java}
> mvn clean -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive
> mvn clean install -DskipTests -pl core -am
> mvn clean test -pl core -Dtest.exclude.tags=org.apache.spark.tags.ExtendedLevelDBTest -Dtest=none -DwildcardSuites=org.apache.spark.deploy.SparkSubmitSuite
> {code}
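>
> For context, the failing test in `SparkSubmitSuite` looks roughly like the sketch below. It is reconstructed from the spark-submit command line in the log that follows; helper names such as `runSparkSubmit` and the class packed into the test jar are approximations, not the exact source:
>
> {code:scala}
> // Rough sketch of SparkSubmitSuite's "include an external JAR in SparkR"
> // (reconstructed from the logged spark-submit invocation; helper names
> // are approximate).
> test("include an external JAR in SparkR") {
>   assume(RUtils.isRInstalled, "R isn't installed on this machine.")
>   val sparkHome = sys.props("spark.test.home")
>   // jarTest.R starts a SparkR session and calls into the jar's classes.
>   val rScriptDir =
>     Seq(sparkHome, "R", "pkg", "tests", "fulltests", "jarTest.R").mkString(File.separator)
>   val jarURL = TestUtils.createJarWithClasses(Seq("sparkrtest.DummyClass"))
>   runSparkSubmit(Seq(
>     "--name", "testApp",
>     "--master", "local",
>     "--jars", jarURL.toString,
>     "--verbose",
>     "--conf", "spark.ui.enabled=false",
>     rScriptDir))
> }
> {code}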
>
>
> The test `include an external JAR in SparkR` then fails as follows:
>
> {code:java}
> include an external JAR in SparkR *** FAILED ***
> spark-submit returned with exit code 1.
> Command line: '/Users/Spark/spark-source/bin/spark-submit' '--name' 'testApp' '--master' 'local' '--jars' 'file:/Users/Spark/spark-source/core/target/tmp/spark-e15e960c-1c10-44cd-99d4-f1905e4c18be/sparkRTestJar-1660632952368.jar' '--verbose' '--conf' 'spark.ui.enabled=false' '/Users/Spark/spark-source/R/pkg/tests/fulltests/jarTest.R'
>
> 2022-08-15 23:55:53.495 - stderr> Using properties file: null
> 2022-08-15 23:55:53.58 - stderr> Parsed arguments:
> 2022-08-15 23:55:53.58 - stderr> master local
> 2022-08-15 23:55:53.58 - stderr> deployMode null
> 2022-08-15 23:55:53.58 - stderr> executorMemory null
> 2022-08-15 23:55:53.581 - stderr> executorCores null
> 2022-08-15 23:55:53.581 - stderr> totalExecutorCores null
> 2022-08-15 23:55:53.581 - stderr> propertiesFile null
> 2022-08-15 23:55:53.581 - stderr> driverMemory null
> 2022-08-15 23:55:53.581 - stderr> driverCores null
> 2022-08-15 23:55:53.581 - stderr> driverExtraClassPath null
> 2022-08-15 23:55:53.581 - stderr> driverExtraLibraryPath null
> 2022-08-15 23:55:53.581 - stderr> driverExtraJavaOptions null
> 2022-08-15 23:55:53.581 - stderr> supervise false
> 2022-08-15 23:55:53.581 - stderr> queue null
> 2022-08-15 23:55:53.581 - stderr> numExecutors null
> 2022-08-15 23:55:53.581 - stderr> files null
> 2022-08-15 23:55:53.581 - stderr> pyFiles null
> 2022-08-15 23:55:53.581 - stderr> archives null
> 2022-08-15 23:55:53.581 - stderr> mainClass null
> 2022-08-15 23:55:53.581 - stderr> primaryResource file:/Users/Spark/spark-source/R/pkg/tests/fulltests/jarTest.R
> 2022-08-15 23:55:53.581 - stderr> name testApp
> 2022-08-15 23:55:53.581 - stderr> childArgs []
> 2022-08-15 23:55:53.581 - stderr> jars file:/Users/Spark/spark-source/core/target/tmp/spark-e15e960c-1c10-44cd-99d4-f1905e4c18be/sparkRTestJar-1660632952368.jar
> 2022-08-15 23:55:53.581 - stderr> packages null
> 2022-08-15 23:55:53.581 - stderr> packagesExclusions null
> 2022-08-15 23:55:53.581 - stderr> repositories null
> 2022-08-15 23:55:53.581 - stderr> verbose true
> 2022-08-15 23:55:53.581 - stderr>
> 2022-08-15 23:55:53.581 - stderr> Spark properties used, including those specified through
> 2022-08-15 23:55:53.581 - stderr> --conf and those from the properties file null:
> 2022-08-15 23:55:53.581 - stderr> (spark.ui.enabled,false)
> 2022-08-15 23:55:53.581 - stderr>
> 2022-08-15 23:55:53.581 - stderr>
> 2022-08-15 23:55:53.729 - stderr> /Users/Spark/spark-source/core/target/tmp/spark-e15e960c-1c10-44cd-99d4-f1905e4c18be/sparkRTestJar-1660632952368.jar doesn't contain R source code, skipping...
> 2022-08-15 23:55:54.058 - stderr> Main class:
> 2022-08-15 23:55:54.058 - stderr> org.apache.spark.deploy.RRunner
> 2022-08-15 23:55:54.058 - stderr> Arguments:
> 2022-08-15 23:55:54.058 - stderr> file:/Users/Spark/spark-source/R/pkg/tests/fulltests/jarTest.R
> 2022-08-15 23:55:54.06 - stderr> Spark config:
> 2022-08-15 23:55:54.06 - stderr> (spark.app.name,testApp)
> 2022-08-15 23:55:54.06 - stderr> (spark.app.submitTime,1660632954058)
> 2022-08-15 23:55:54.06 - stderr> (spark.files,file:/Users/Spark/spark-source/R/pkg/tests/fulltests/jarTest.R)
> 2022-08-15 23:55:54.06 - stderr> (spark.jars,file:///Users/Spark/spark-source/core/target/tmp/spark-e15e960c-1c10-44cd-99d4-f1905e4c18be/sparkRTestJar-1660632952368.jar)
> 2022-08-15 23:55:54.06 - stderr> (spark.master,local)
> 2022-08-15 23:55:54.06 - stderr> (spark.repl.local.jars,file:///Users/Spark/spark-source/core/target/tmp/spark-e15e960c-1c10-44cd-99d4-f1905e4c18be/sparkRTestJar-1660632952368.jar)
> 2022-08-15 23:55:54.06 - stderr> (spark.submit.deployMode,client)
> 2022-08-15 23:55:54.06 - stderr> (spark.submit.pyFiles,)
> 2022-08-15 23:55:54.06 - stderr> (spark.ui.enabled,false)
> 2022-08-15 23:55:54.06 - stderr> Classpath elements:
> 2022-08-15 23:55:54.06 - stderr> file:///Users/Spark/spark-source/core/target/tmp/spark-e15e960c-1c10-44cd-99d4-f1905e4c18be/sparkRTestJar-1660632952368.jar
> 2022-08-15 23:55:54.06 - stderr>
> 2022-08-15 23:55:54.06 - stderr>
> 2022-08-15 23:55:57.094 - stdout>
> 2022-08-15 23:55:57.094 - stdout> Attaching package: ‘SparkR’
> 2022-08-15 23:55:57.094 - stdout>
> 2022-08-15 23:55:57.095 - stdout> The following objects are masked from ‘package:stats’:
> 2022-08-15 23:55:57.095 - stdout>
> 2022-08-15 23:55:57.095 - stdout> cov, filter, lag, na.omit, predict, sd, var, window
> 2022-08-15 23:55:57.095 - stdout>
> 2022-08-15 23:55:57.095 - stdout> The following objects are masked from ‘package:base’:
> 2022-08-15 23:55:57.095 - stdout>
> 2022-08-15 23:55:57.095 - stdout> as.data.frame, colnames, colnames<-, drop, endsWith, intersect,
> 2022-08-15 23:55:57.095 - stdout> rank, rbind, sample, startsWith, subset, summary, transform, union
> 2022-08-15 23:55:57.095 - stdout>
> 2022-08-15 23:55:57.096 - stdout> Spark package found in SPARK_HOME: /Users/Spark/spark-source
> 2022-08-15 23:55:58.954 - stdout> Error in handleErrors(returnStatus, conn) :
> 2022-08-15 23:55:58.955 - stdout> java.lang.NoClassDefFoundError: org/antlr/v4/runtime/ParserRuleContext
> 2022-08-15 23:55:58.955 - stdout> at org.apache.spark.sql.internal.BaseSessionStateBuilder.sqlParser$lzycompute(BaseSessionStateBuilder.scala:138)
> 2022-08-15 23:55:58.955 - stdout> at org.apache.spark.sql.internal.BaseSessionStateBuilder.sqlParser(BaseSessionStateBuilder.scala:137)
> 2022-08-15 23:55:58.955 - stdout> at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:363)
> 2022-08-15 23:55:58.955 - stdout> at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1175)
> 2022-08-15 23:55:58.955 - stdout> at org.apache.spark.sql.SparkSession.$anonfun$sessionState$2(SparkSession.scala:162)
> 2022-08-15 23:55:58.957 - stdout> at scala.Option.getOrElse(Option.scala:189)
> 2022-08-15 23:55:58.957 - stdout> at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:160)
> 2022-08-15 23:55:58.957 - stdout> at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:157)
> 2022-08-15 23:55:58.957 - stdout> at org.apache.spark.sql.api.r.SQLUtils$.$anonfun$setSparkContextSessionConf$2(SQLUtils.scala:75)
> 2022-08-15 23:55:58.957 - stdout> at org.apache.spark.sql.api.r.SQLUtils$.$anonfun$setSparkContextSessionConf$2$adapted(SQLU
> 2022-08-15 23:55:58.957 - stdout> Calls: sparkR.session -> callJStatic -> invokeJava -> handleErrors
> {code}
>
>
> The key error message is
> *java.lang.NoClassDefFoundError: org/antlr/v4/runtime/ParserRuleContext*
> The stack trace shows why: `sparkR.session()` ends up in `SQLUtils.setSparkContextSessionConf`, which touches `SparkSession.sessionState`, and building the session state lazily instantiates the SQL parser, whose generated classes extend ANTLR's `ParserRuleContext`. The antlr4 runtime is not on the core module's test classpath, so I think this test case should not be in the core module.
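>
> Any code path that forces the lazy session state fails the same way when the antlr4 runtime is missing. A minimal sketch (the `SELECT 1` query is just an arbitrary way to force `sqlParser$lzycompute`; the object name is made up for illustration):
>
> {code:scala}
> import org.apache.spark.sql.SparkSession
>
> object ForceSqlParser {
>   def main(args: Array[String]): Unit = {
>     val spark = SparkSession.builder()
>       .master("local")
>       .appName("antlr-classpath-check")
>       .getOrCreate()
>     // The first parse builds BaseSessionStateBuilder.sqlParser, whose
>     // generated parser classes extend org.antlr.v4.runtime.ParserRuleContext;
>     // without antlr4-runtime on the classpath this throws the same
>     // NoClassDefFoundError as the SparkR test.
>     spark.sql("SELECT 1").collect()
>     spark.stop()
>   }
> }
> {code}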
>
--
This message was sent by Atlassian Jira
(v8.20.10#820010)