Posted to issues@flink.apache.org by "Martijn Visser (Jira)" <ji...@apache.org> on 2022/03/23 14:35:00 UTC

[jira] [Updated] (FLINK-26827) FlinkSQL and Hive integration error

     [ https://issues.apache.org/jira/browse/FLINK-26827?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Martijn Visser updated FLINK-26827:
-----------------------------------
    Priority: Major  (was: Blocker)

> FlinkSQL and Hive integration error
> -----------------------------------
>
>                 Key: FLINK-26827
>                 URL: https://issues.apache.org/jira/browse/FLINK-26827
>             Project: Flink
>          Issue Type: Bug
>          Components: Table SQL / API
>    Affects Versions: 1.13.3
>         Environment: CDH 6.2.1, Linux, JDK 1.8
>            Reporter: zhushifeng
>            Priority: Major
>
> Hive 2.1, Flink 1.13.3, Flink CDC 2.1. Following the official documentation for the integration, the error is as follows:
> Flink SQL> select * from  rptdata.basic_xhsys_user ;
> Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
>         at org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:201)
>         at org.apache.flink.table.client.SqlClient.main(SqlClient.java:161)
> Caused by: java.lang.ExceptionInInitializerError
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:348)
>         at org.apache.flink.connectors.hive.HiveSourceFileEnumerator.createMRSplits(HiveSourceFileEnumerator.java:94)
>         at org.apache.flink.connectors.hive.HiveSourceFileEnumerator.createInputSplits(HiveSourceFileEnumerator.java:71)
>         at org.apache.flink.connectors.hive.HiveTableSource.lambda$getDataStream$1(HiveTableSource.java:212)
>         at org.apache.flink.connectors.hive.HiveParallelismInference.logRunningTime(HiveParallelismInference.java:107)
>         at org.apache.flink.connectors.hive.HiveParallelismInference.infer(HiveParallelismInference.java:95)
>         at org.apache.flink.connectors.hive.HiveTableSource.getDataStream(HiveTableSource.java:207)
>         at org.apache.flink.connectors.hive.HiveTableSource$1.produceDataStream(HiveTableSource.java:123)
>         at org.apache.flink.table.planner.plan.nodes.exec.common.CommonExecTableSourceScan.translateToPlanInternal(CommonExecTableSourceScan.java:96)
>         at org.apache.flink.table.planner.plan.nodes.exec.ExecNodeBase.translateToPlan(ExecNodeBase.java:134)
>         at org.apache.flink.table.planner.plan.nodes.exec.ExecEdge.translateToPlan(ExecEdge.java:247)
>         at org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecSink.translateToPlanInternal(StreamExecSink.java:114)
>         at org.apache.flink.table.planner.plan.nodes.exec.ExecNodeBase.translateToPlan(ExecNodeBase.java:134)
>         at org.apache.flink.table.planner.delegation.StreamPlanner.$anonfun$translateToPlan$1(StreamPlanner.scala:70)
>         at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:233)
>         at scala.collection.Iterator.foreach(Iterator.scala:937)
>         at scala.collection.Iterator.foreach$(Iterator.scala:937)
>         at scala.collection.AbstractIterator.foreach(Iterator.scala:1425)
>         at scala.collection.IterableLike.foreach(IterableLike.scala:70)
>         at scala.collection.IterableLike.foreach$(IterableLike.scala:69)
>         at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>         at scala.collection.TraversableLike.map(TraversableLike.scala:233)
>         at scala.collection.TraversableLike.map$(TraversableLike.scala:226)
>         at scala.collection.AbstractTraversable.map(Traversable.scala:104)
>         at org.apache.flink.table.planner.delegation.StreamPlanner.translateToPlan(StreamPlanner.scala:69)
>         at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:165)
>         at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1518)
>         at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeQueryOperation(TableEnvironmentImpl.java:791)
>         at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1225)
>         at org.apache.flink.table.client.gateway.local.LocalExecutor.lambda$executeOperation$3(LocalExecutor.java:213)
>         at org.apache.flink.table.client.gateway.context.ExecutionContext.wrapClassLoader(ExecutionContext.java:90)
>         at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:213)
>         at org.apache.flink.table.client.gateway.local.LocalExecutor.executeQuery(LocalExecutor.java:235)
>         at org.apache.flink.table.client.cli.CliClient.callSelect(CliClient.java:479)
>         at org.apache.flink.table.client.cli.CliClient.callOperation(CliClient.java:412)
>         at org.apache.flink.table.client.cli.CliClient.lambda$executeStatement$0(CliClient.java:327)
>         at java.util.Optional.ifPresent(Optional.java:159)
>         at org.apache.flink.table.client.cli.CliClient.executeStatement(CliClient.java:327)
>         at org.apache.flink.table.client.cli.CliClient.executeInteractive(CliClient.java:297)
>         at org.apache.flink.table.client.cli.CliClient.executeInInteractiveMode(CliClient.java:221)
>         at org.apache.flink.table.client.SqlClient.openCli(SqlClient.java:151)
>         at org.apache.flink.table.client.SqlClient.start(SqlClient.java:95)
>         at org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:187)
>         ... 1 more
> Caused by: java.lang.RuntimeException: java.lang.IllegalArgumentException: Unrecognized Hadoop major version number: 3.0.0-cdh6.2.1
>         at org.apache.hadoop.hive.shims.ShimLoader.getHadoopShims(ShimLoader.java:102)
>         at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat.<clinit>(OrcInputFormat.java:161)
>         ... 45 more
> Caused by: java.lang.IllegalArgumentException: Unrecognized Hadoop major version number: 3.0.0-cdh6.2.1
>         at org.apache.hadoop.hive.shims.ShimLoader.getMajorVersion(ShimLoader.java:177)
>         at org.apache.hadoop.hive.shims.ShimLoader.loadShims(ShimLoader.java:144)
>         at org.apache.hadoop.hive.shims.ShimLoader.getHadoopShims(ShimLoader.java:99)
>         ... 46 more
> Shutting down the session...
> done.
>  
> Steps taken:
> Copied JARs:
> // Flink's Hive connector
>        flink-connector-hive_2.11-1.13.3.jar
>        // Hive dependencies
>        hive-exec-2.1.0.jar (replaced with hive-exec-2.1.1-cdh6.2.1.jar)
>        // add antlr-runtime if you need to use hive dialect
>        antlr-runtime-3.5.2.jar
>  
>  
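For context, the root cause in the trace (ShimLoader rejecting the Hadoop version string "3.0.0-cdh6.2.1") can be sketched roughly as follows. This is an illustrative reconstruction, not the actual Hive 2.1 source; the shim-class mapping and method names are assumptions that only loosely mirror org.apache.hadoop.hive.shims.ShimLoader:

```java
import java.util.HashMap;
import java.util.Map;

public class ShimLoaderSketch {
    // Assumption: upstream Hive 2.1 only maps Hadoop major version "2"
    // to a shim implementation, so any 3.x version string is rejected.
    private static final Map<String, String> HADOOP_SHIM_CLASSES = new HashMap<>();
    static {
        HADOOP_SHIM_CLASSES.put("2", "org.apache.hadoop.hive.shims.Hadoop23Shims");
    }

    // Extract the major version, e.g. "3" from "3.0.0-cdh6.2.1".
    static String getMajorVersion(String vers) {
        String[] parts = vers.split("\\.");
        if (parts.length < 2) {
            throw new RuntimeException("Illegal Hadoop Version: " + vers);
        }
        return parts[0];
    }

    static String loadShimClassName(String hadoopVersion) {
        String major = getMajorVersion(hadoopVersion);
        String shimClass = HADOOP_SHIM_CLASSES.get(major);
        if (shimClass == null) {
            // This is the failure path reported in the stack trace above.
            throw new IllegalArgumentException(
                    "Unrecognized Hadoop major version number: " + hadoopVersion);
        }
        return shimClass;
    }

    public static void main(String[] args) {
        // A Hadoop 2.x version resolves to a shim class.
        System.out.println(loadShimClassName("2.7.4"));
        // A Hadoop 3.x / CDH 6 version has no mapping and throws.
        try {
            loadShimClassName("3.0.0-cdh6.2.1");
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

Under this reading, a hive-exec built against Hadoop 2 cannot recognize the Hadoop 3 version string that CDH 6.2.1 reports, which is why swapping in a hive-exec compatible with Hadoop 3 (or a matching flink-sql-connector-hive bundle) is the usual remedy.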



--
This message was sent by Atlassian Jira
(v8.20.1#820001)