Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2021/09/27 17:46:00 UTC

[jira] [Commented] (SPARK-36864) guava version mismatch with hadoop-aws

    [ https://issues.apache.org/jira/browse/SPARK-36864?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17420935#comment-17420935 ] 

Apache Spark commented on SPARK-36864:
--------------------------------------

User 'warrenzhu25' has created a pull request for this issue:
https://github.com/apache/spark/pull/34117

> guava version mismatch with hadoop-aws
> --------------------------------------
>
>                 Key: SPARK-36864
>                 URL: https://issues.apache.org/jira/browse/SPARK-36864
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>    Affects Versions: 3.1.2
>            Reporter: Zhongwei Zhu
>            Priority: Minor
>
> When using hadoop-aws 3.2 with Spark 3.0, the error below occurs (a minimal sketch reproducing the failing call follows the stack trace). This is caused by a Guava version mismatch: Hadoop uses Guava 27.0-jre while Spark ships 14.0.1.
> Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;Ljava/lang/Object;)V
>  at org.apache.hadoop.fs.s3a.S3AUtils.lookupPassword(S3AUtils.java:742)
>  at org.apache.hadoop.fs.s3a.S3AUtils.getAWSAccessKeys(S3AUtils.java:712)
>  at org.apache.hadoop.fs.s3a.S3AUtils.createAWSCredentialProviderSet(S3AUtils.java:559)
>  at org.apache.hadoop.fs.s3a.DefaultS3ClientFactory.createS3Client(DefaultS3ClientFactory.java:52)
>  at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:264)
>  at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3303)
>  at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
>  at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3352)
>  at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3320)
>  at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:479)
>  at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1853)
>  at org.apache.spark.deploy.history.EventLogFileWriter.<init>(EventLogFileWriters.scala:60)
>  at org.apache.spark.deploy.history.SingleEventLogFileWriter.<init>(EventLogFileWriters.scala:213)
>  at org.apache.spark.deploy.history.EventLogFileWriter$.apply(EventLogFileWriters.scala:181)
>  at org.apache.spark.scheduler.EventLoggingListener.<init>(EventLoggingListener.scala:66)
>  at org.apache.spark.SparkContext.<init>(SparkContext.scala:584)
>  at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2588)
>  at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:937)
>  at scala.Option.getOrElse(Option.scala:189)
>  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:931)
>  at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:30)
>  at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498)
>  at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
>  at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:944)
>  at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
>  at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
>  at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
>  at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1023)
>  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1032)
>  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
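
For context, here is a minimal sketch of the overload mismatch behind the NoSuchMethodError. It assumes only Guava on the classpath; the class name GuavaMismatchDemo and the string values are hypothetical, chosen to mirror the check in S3AUtils.lookupPassword:

import com.google.common.base.Preconditions;

// Hypothetical demo class. Compile against Guava 27.0-jre, then run with
// Guava 14.0.1 on the classpath to reproduce the NoSuchMethodError above.
public class GuavaMismatchDemo {
    public static void main(String[] args) {
        String key = "fs.s3a.access.key"; // illustrative values only
        String prefix = "fs.s3a.";
        // Compiled against Guava 27.0-jre, javac binds this call to the
        // two-Object overload checkArgument(boolean, String, Object, Object)
        // introduced in Guava 20 -- exactly the
        // (ZLjava/lang/String;Ljava/lang/Object;Ljava/lang/Object;)V
        // descriptor in the stack trace. Guava 14.0.1 only provides the
        // varargs form checkArgument(boolean, String, Object...), so the JVM
        // throws NoSuchMethodError at this invocation regardless of whether
        // the condition holds.
        Preconditions.checkArgument(key.startsWith(prefix),
            "key %s does not start with %s", key, prefix);
        System.out.println("ok");
    }
}

Commonly suggested workarounds (not part of this ticket) are to replace the Guava jar under $SPARK_HOME/jars with 27.0-jre (together with its failureaccess dependency), or to put a newer Guava on the application classpath with spark.driver.userClassPathFirst=true and spark.executor.userClassPathFirst=true.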



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org