Posted to issues@spark.apache.org by "Jorge Machado (Jira)" <ji...@apache.org> on 2020/03/28 09:51:00 UTC

[jira] [Comment Edited] (SPARK-23897) Guava version

    [ https://issues.apache.org/jira/browse/SPARK-23897?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17069352#comment-17069352 ] 

Jorge Machado edited comment on SPARK-23897 at 3/28/20, 9:50 AM:
-----------------------------------------------------------------

I think that master is actually broken, at least as of commit d025ddbaa7e7b9746d8e47aeed61ed39d2f09f0e. I built it with:
{code}
./build/mvn clean package -DskipTests -Phadoop-3.2 -Pkubernetes -Phadoop-cloud
{code}
I get: 
{code}
jorge@Jorges-MacBook-Pro ~/Downloads/spark/dist/bin
> $ java -version
java version "1.8.0_211"
Java(TM) SE Runtime Environment (build 1.8.0_211-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.211-b12, mixed mode)

jorge@Jorges-MacBook-Pro ~/Downloads/spark/dist/bin
> $ ./run-example SparkPi 100
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
	at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
	at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
	at org.apache.spark.deploy.SparkHadoopUtil$.org$apache$spark$deploy$SparkHadoopUtil$$appendS3AndSparkHadoopHiveConfigurations(SparkHadoopUtil.scala:456)
	at org.apache.spark.deploy.SparkHadoopUtil$.newConfiguration(SparkHadoopUtil.scala:427)
	at org.apache.spark.deploy.SparkSubmit.$anonfun$prepareSubmitEnvironment$2(SparkSubmit.scala:342)
	at scala.Option.getOrElse(Option.scala:189)
	at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:342)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:871)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
{code}
If I delete the bundled Guava 14 jar and add a Guava 28 jar instead, it works.
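The jar swap can be sketched roughly as below (a hedged sketch, not a supported procedure: the exact jar file names under dist/jars are assumptions, and placeholder files stand in for the real jars so the sketch is self-contained; in practice copy the real Guava 28 jar from a local Maven repository or a download):

```shell
# Sketch of the workaround: swap the Guava jar in the built distribution.
# Placeholder files are used here so the snippet runs standalone;
# jar names are assumptions -- check your dist/jars for the real ones.
jars=$(mktemp -d)                       # stand-in for dist/jars
touch "$jars/guava-14.0.1.jar"          # the jar Spark currently bundles
rm "$jars"/guava-14.*.jar               # remove Guava 14
touch "$jars/guava-28.2-jre.jar"        # in practice: cp the real Guava 28 jar here
ls "$jars"
```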



> Guava version
> -------------
>
>                 Key: SPARK-23897
>                 URL: https://issues.apache.org/jira/browse/SPARK-23897
>             Project: Spark
>          Issue Type: Dependency upgrade
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Sercan Karaoglu
>            Priority: Minor
>
> Guava dependency version 14 is quite old and should be updated to at least 16. The Google Cloud Storage connector uses a newer Guava, which triggers the fairly common error "java.lang.NoSuchMethodError: com.google.common.base.Splitter.splitToList(Ljava/lang/CharSequence;)Ljava/util/List;" and crashes the application.
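A quick way to check whether a given Guava jar predates the methods the connectors link against is a javap probe like the one below (a sketch: the jar path is an assumption, and `splitToList` was added to `Splitter` after Guava 14, so an old jar will not list it):

```shell
# Probe a Guava jar for Splitter.splitToList; jar path is an assumption.
GUAVA_JAR=${GUAVA_JAR:-dist/jars/guava-14.0.1.jar}
result=$(javap -classpath "$GUAVA_JAR" com.google.common.base.Splitter 2>/dev/null \
           | grep splitToList \
           || echo "splitToList missing: Guava on this classpath is too old")
echo "$result"
```

The same probe works for the `Preconditions.checkArgument` overload in the stack trace above, by pointing javap at `com.google.common.base.Preconditions` instead.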



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org