Posted to dev@submarine.apache.org by "Kevin Su (Jira)" <ji...@apache.org> on 2022/02/22 04:22:00 UTC

[jira] [Updated] (SUBMARINE-705) Unable to run tests in spark-security with spark-3.0 + hadoop-3.2 profiles

     [ https://issues.apache.org/jira/browse/SUBMARINE-705?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kevin Su updated SUBMARINE-705:
-------------------------------
    Target Version: 0.8.0  (was: 0.7.0)

> Unable to run tests in spark-security with spark-3.0 + hadoop-3.2 profiles 
> ---------------------------------------------------------------------------
>
>                 Key: SUBMARINE-705
>                 URL: https://issues.apache.org/jira/browse/SUBMARINE-705
>             Project: Apache Submarine
>          Issue Type: Bug
>          Components: Security
>            Reporter: Igor Calabria
>            Priority: Minor
>             Fix For: 0.7.0
>
>
> {code:java}
> mvn -pl :submarine-spark-security clean test -Pranger-2.0 -Pspark-3.0  -Phadoop-3.2
> {code}
> This fails with:
> {code:java}
> *** RUN ABORTED ***
>   java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
>   at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
>   at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
>   at org.apache.spark.deploy.SparkHadoopUtil$.org$apache$spark$deploy$SparkHadoopUtil$$appendS3AndSparkHadoopHiveConfigurations(SparkHadoopUtil.scala:456)
>   at org.apache.spark.deploy.SparkHadoopUtil$.newConfiguration(SparkHadoopUtil.scala:427)
>   at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:122)
>   at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:49)
>   at org.apache.spark.deploy.SparkHadoopUtil$.instance$lzycompute(SparkHadoopUtil.scala:397)
>   at org.apache.spark.deploy.SparkHadoopUtil$.instance(SparkHadoopUtil.scala:397)
>   at org.apache.spark.deploy.SparkHadoopUtil$.get(SparkHadoopUtil.scala:418)
>   at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:95)
> {code}
> This is most likely caused by a Guava version conflict on the test classpath: the {{Preconditions.checkArgument(boolean, String, Object)}} overload that Hadoop's {{Configuration}} calls does not exist in older Guava releases.
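> One way to confirm which Guava version each profile pulls in is {{mvn -pl :submarine-spark-security dependency:tree -Dincludes=com.google.guava -Pranger-2.0 -Pspark-3.0 -Phadoop-3.2}}. If an older Guava wins, a sketch of a workaround is to exclude it from the offending dependency and pin a newer version in the module's pom; the {{ranger-plugins-common}} artifact and the {{27.0-jre}} version below are illustrative assumptions, not taken from the actual Submarine pom:
> {code:xml}
> <!-- Sketch only: exclude the transitive (older) Guava from the suspected dependency -->
> <dependency>
>   <groupId>org.apache.ranger</groupId>
>   <artifactId>ranger-plugins-common</artifactId>
>   <exclusions>
>     <exclusion>
>       <groupId>com.google.guava</groupId>
>       <artifactId>guava</artifactId>
>     </exclusion>
>   </exclusions>
> </dependency>
> <!-- Pin a Guava new enough to have the missing checkArgument overload
>      (assumption: the version Hadoop 3.2 was built against) -->
> <dependency>
>   <groupId>com.google.guava</groupId>
>   <artifactId>guava</artifactId>
>   <version>27.0-jre</version>
> </dependency>
> {code}
> Alternatively, a {{<dependencyManagement>}} entry forcing a single Guava version across the module would have the same effect without per-dependency exclusions.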



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@submarine.apache.org
For additional commands, e-mail: dev-help@submarine.apache.org