Posted to user-zh@flink.apache.org by 董海峰(Sharp) <Sh...@aishu.cn> on 2021/01/23 07:40:22 UTC

Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V

Hi, I recently ran into a problem. I posted it to the community but got no answer, so I'd like to ask you about it; please reply when you have time. Thank you!
Hadoop 3.3.0, Flink 1.12, Hive 3.1.2
I want to integrate Hive and Flink. After I configure the sql-client-defaults.yaml file as follows,
catalogs:
   - name: default_catalog
     type: hive
     hive-conf-dir: /cdc/apache-hive-3.1.2-bin/conf

I start the Flink SQL client, but the following error is reported:
[root@dhf4 bin]# ./sql-client.sh embedded
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/cdc/flink-1.12.0/lib/log4j-slf4j-impl-2.12.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/cdc/hadoop-3.3.0/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
No default environment specified.
Searching for '/cdc/flink-1.12.0/conf/sql-client-defaults.yaml'...found.
Reading default environment from: file:/cdc/flink-1.12.0/conf/sql-client-defaults.yaml
No session environment specified.
2021-01-20 10:12:38,179 INFO  org.apache.hadoop.hive.conf.HiveConf                         [] - Found configuration file file:/cdc/apache-hive-3.1.2-bin/conf/hive-site.xml
Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
    at org.apache.flink.table.client.SqlClient.main(SqlClient.java:208)
Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
    at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:878)
    at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:226)
    at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
    at org.apache.flink.table.client.SqlClient.main(SqlClient.java:196)
Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1380)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1361)
    at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
    at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
    at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
    at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141)
    at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5109)
    at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:211)
    at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:164)
    at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:89)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:384)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:634)
    at java.util.HashMap.forEach(HashMap.java:1289)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:633)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:266)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:632)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:529)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:185)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:138)
    at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:867)
... 3 more

The log content is as follows:
[root@dhf4 bin]# cat ../log/flink-root-sql-client-dhf4.log
2021-01-20 10:12:36,246 INFO  org.apache.flink.configuration.GlobalConfiguration           [] - Loading configuration property: jobmanager.rpc.address, localhost
2021-01-20 10:12:36,252 INFO  org.apache.flink.configuration.GlobalConfiguration           [] - Loading configuration property: jobmanager.rpc.port, 6123
2021-01-20 10:12:36,252 INFO  org.apache.flink.configuration.GlobalConfiguration           [] - Loading configuration property: jobmanager.memory.process.size, 1600m
2021-01-20 10:12:36,252 INFO  org.apache.flink.configuration.GlobalConfiguration           [] - Loading configuration property: taskmanager.memory.process.size, 1728m
2021-01-20 10:12:36,252 INFO  org.apache.flink.configuration.GlobalConfiguration           [] - Loading configuration property: taskmanager.numberOfTaskSlots, 1
2021-01-20 10:12:36,256 INFO  org.apache.flink.configuration.GlobalConfiguration           [] - Loading configuration property: parallelism.default, 1
2021-01-20 10:12:36,256 INFO  org.apache.flink.configuration.GlobalConfiguration           [] - Loading configuration property: jobmanager.execution.failover-strategy, region
2021-01-20 10:12:36,394 INFO  org.apache.flink.table.client.gateway.local.LocalExecutor    [] - Using default environment file: file:/cdc/flink-1.12.0/conf/sql-client-defaults.yaml
2021-01-20 10:12:36,754 INFO  org.apache.flink.table.client.config.entries.ExecutionEntry  [] - Property 'execution.restart-strategy.type' not specified. Using default value: fallback
2021-01-20 10:12:38,179 INFO  org.apache.hadoop.hive.conf.HiveConf                         [] - Found configuration file file:/cdc/apache-hive-3.1.2-bin/conf/hive-site.xml
2021-01-20 10:12:38,404 ERROR org.apache.flink.table.client.SqlClient                      [] - SQL Client must stop. Unexpected exception. This is a bug. Please consider filing an issue.
org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
    at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:878) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:226) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.SqlClient.main(SqlClient.java:196) [flink-sql-client_2.11-1.12.0.jar:1.12.0]
Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1380) ~[hadoop-common-3.3.0.jar:?]
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:1361) ~[hadoop-common-3.3.0.jar:?]
    at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536) ~[hadoop-mapreduce-client-core-3.3.0.jar:?]
    at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554) ~[hadoop-mapreduce-client-core-3.3.0.jar:?]
    at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448) ~[hadoop-mapreduce-client-core-3.3.0.jar:?]
    at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141) ~[flink-sql-connector-hive-3.1.2_2.11-1.12.0.jar:1.12.0]
    at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5109) ~[flink-sql-connector-hive-3.1.2_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:211) ~[flink-connector-hive_2.12-1.12.0.jar:1.12.0]
    at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:164) ~[flink-connector-hive_2.12-1.12.0.jar:1.12.0]
    at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:89) ~[flink-connector-hive_2.12-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:384) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:634) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_272]
    at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:633) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:266) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:632) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:529) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:185) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:138) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
    at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:867) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
... 3 more

I have tried many solutions, such as aligning the Guava versions, but none of them worked. Is there any other solution?

The question link is attached:
https://stackoverflow.com/questions/65770190/caused-by-java-lang-nosuchmethoderror-com-google-common-base-preconditions-che
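For reference, the JVM descriptor in the error message spells out exactly which overload could not be resolved. A small illustrative sketch (the helper name is my own, not from any Flink or Hadoop code) that decodes such a descriptor:

```python
# Hypothetical helper: decode a JVM method descriptor such as the one in the
# NoSuchMethodError, to see which overload the JVM was looking for.
FIELD_TYPES = {"Z": "boolean", "B": "byte", "C": "char", "S": "short",
               "I": "int", "J": "long", "F": "float", "D": "double", "V": "void"}

def decode_descriptor(desc):
    """Return (param_types, return_type) for a JVM method descriptor."""
    assert desc.startswith("(")
    params_part, ret = desc[1:].split(")")

    def read(s, i):
        if s[i] in FIELD_TYPES:              # primitive or void
            return FIELD_TYPES[s[i]], i + 1
        if s[i] == "L":                      # object type: Lpkg/Class;
            end = s.index(";", i)
            return s[i + 1:end].replace("/", "."), end + 1
        raise ValueError("unsupported type: " + s[i])

    params, i = [], 0
    while i < len(params_part):
        t, i = read(params_part, i)
        params.append(t)
    return params, read(ret, 0)[0]
```

Decoding `(ZLjava/lang/String;Ljava/lang/Object;)V` gives parameters `boolean, java.lang.String, java.lang.Object` returning `void`, i.e. the JVM wanted `checkArgument(boolean, String, Object)` and the Guava copy it loaded does not have that overload.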


________________________________
Disclaimer

This message and its attachments may contain communications, work product or other information which are private, confidential or privileged. Any disclosure, copying, distribution or use of the contents of this message and/or its attachments is prohibited unless specifically authorized by EISOO in writing. If you find that you are not one of the intended recipients of this message, please immediately contact us by e-mail (its@aishu.cn) or by telephone (021-54222601) and delete this message and all of its attachments whether in electronic or in hard copy format. Thank you.

Re: Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V

Posted by yang nick <bd...@gmail.com>.
This should be a Guava jar conflict. Please refer to this article:
https://blog.csdn.net/u012121587/article/details/103903162
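If it is indeed a Guava conflict, a useful first step is to see how many Guava jars the SQL client can load. A minimal sketch; the directory paths mirror the poster's /cdc layout and are assumptions — adjust them to your installation (note the Hive fat connector jar may also bundle its own copy):

```python
# Sketch: list every guava-*.jar visible under the given directories,
# to spot the conflicting copies on the classpath.
from pathlib import Path

def find_guava_jars(dirs):
    """Return the guava-*.jar files found under each existing directory."""
    hits = []
    for d in dirs:
        p = Path(d)
        if p.is_dir():
            hits.extend(sorted(p.rglob("guava-*.jar")))
    return hits

# Example (the poster's layout; adjust paths to your installation):
# for jar in find_guava_jars(["/cdc/flink-1.12.0/lib",
#                             "/cdc/hadoop-3.3.0/share/hadoop/common/lib",
#                             "/cdc/apache-hive-3.1.2-bin/lib"]):
#     print(jar)
```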


On Sun, Jan 24, 2021 at 9:11 AM, 董海峰(Sharp) <Sh...@aishu.cn> wrote:

> [quoted text of the original message trimmed]

Re: Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V

Posted by yujianbo <15...@163.com>.
Following Rui Li's advice, I solved it. Anyone interested can take a look at these:
https://blog.csdn.net/weixin_44500374/article/details/113244560

https://www.jianshu.com/p/f076a4f66527



--
Sent from: http://apache-flink.147419.n8.nabble.com/

Re: Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V

Posted by yujianbo <15...@163.com>.
Rui Li
Good morning. Could you help me take a look at http://apache-flink.147419.n8.nabble.com/flink-flink-lib-td10518.html — is it also caused by a dependency conflict? My situation: same cluster; previously CDH Hadoop 3.0.0 with Hive 2.2.0, and now I'm migrating back from CDH to a community-edition Hadoop cluster with Hadoop 3.3.0 and Hive 3.1.2. Yesterday I solved the hive-exec problem, but today the same code that submitted and ran fine on the old cluster fails on the new one: at startup it immediately reports that it cannot load one of the jars under the lib directory, which is odd. Could it be a YARN conflict?




Re: Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V

Posted by Rui Li <li...@gmail.com>.
Hi,

This is most likely a Guava version conflict between Hadoop and Hive: Hadoop 3.3 depends on Guava 27 [1], while Hive 3.1.2 depends on Guava 19 [2]. Also note that Hive 3.1.2 depends on Hadoop 3.1.0 [3]; in general it is not recommended to run with a Hadoop version higher than the one Hive depends on.

One solution is to relocate Guava inside hive-exec, which requires repackaging hive-exec yourself.
The other is to lower the Hadoop version — not necessarily the cluster's Hadoop, just the Hadoop version used on the Flink/Hive side. That amounts to using an older Hadoop client against a newer Hadoop server, and minor-version compatibility there is generally fine.

[1] https://issues.apache.org/jira/browse/HADOOP-16210
[2] https://github.com/apache/hive/blob/rel/release-3.1.2/pom.xml#L147
[3] https://github.com/apache/hive/blob/rel/release-3.1.2/pom.xml#L150
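The relocation route above would mean repackaging hive-exec with something like the Maven Shade Plugin. A rough, untested sketch of the relevant pom.xml fragment (the shadedPattern name is illustrative, not a standard Hive package):

```xml
<!-- Sketch: relocate com.google.common inside a repackaged hive-exec -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.common</pattern>
            <shadedPattern>org.apache.hive.shaded.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

After shading, the Guava classes hive-exec calls are renamed, so they no longer clash with the Guava 27 that Hadoop 3.3 puts on the classpath.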

On Mon, Jan 25, 2021 at 2:12 PM yujianbo <15...@163.com> wrote:

> Could you share how you eventually solved this? My Hadoop and Hive versions
> are the same as yours.


-- 
Best regards!
Rui Li

Re: Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V

Posted by yujianbo <15...@163.com>.
Could you share how you eventually solved this? My Hadoop and Hive versions are the same as yours.




Re: Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V

Posted by yujianbo <15...@163.com>.
Mine is solved now, following Rui Li's advice. You can also refer to my approach:
https://blog.csdn.net/weixin_44500374/article/details/113244560

https://www.jianshu.com/p/f076a4f66527




