Posted to user-zh@flink.apache.org by guoyb <86...@qq.com> on 2021/03/08 06:15:13 UTC

Re: [flink sql-client reading and writing Kerberos-secured Hive]

Hello!
hive.metastore.sasl.enabled is true.


When starting the SQL client, the authentication info is picked up correctly, and the table names can be read from the metastore.


But actual reads and writes fail authentication.
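For context, the Kerberos credentials that Flink itself uses are normally configured in flink-conf.yaml. This is a minimal sketch; the keytab path and principal below are placeholders, not values taken from this thread:

```yaml
# Hypothetical flink-conf.yaml fragment for Kerberos login.
# Keytab path and principal are placeholders.
security.kerberos.login.use-ticket-cache: true
security.kerberos.login.keytab: /path/to/flink.keytab
security.kerberos.login.principal: flink-user@EXAMPLE.COM
```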



--- Original message ---
From: "Rui Li" <lirui.fudan@gmail.com>
Date: Monday, March 8, 2021, 12:12 PM
To: "user-zh" <user-zh@flink.apache.org>
Subject: Re: [flink sql-client reading and writing Kerberos-secured Hive]


Hi,

From the stack trace you posted, execution reached the set_ugi method, which means the client believes the server does not have Kerberos enabled. Please double-check that the hive-site.xml your HiveCatalog points at is configured correctly; for example, is hive.metastore.sasl.enabled actually set to true there?
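For reference, a Kerberos-secured metastore is typically declared in hive-site.xml with properties along these lines. This is a sketch; the principal below is a placeholder, not a value from this thread:

```xml
<!-- Hypothetical hive-site.xml fragment for a Kerberos-secured metastore.
     The Kerberos principal is a placeholder. -->
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://cdh6.com:9083</value>
</property>
<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.metastore.kerberos.principal</name>
  <value>hive/_HOST@EXAMPLE.COM</value>
</property>
```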

On Sun, Mar 7, 2021 at 5:49 PM 861277329@qq.com <861277329@qq.com> wrote:

> Environment:
> flink 1.12.1
> hive 2.1.0
> CDH 6.2.0
>
>
> [Problem description]
> Without Kerberos authentication enabled, Hive tables could be read and written normally.
> After enabling Kerberos authentication, the Hive metastore metadata can still be read at startup, but the tables themselves can no longer be read or written.
>
>
> [sql-client.sh embedded]
> Flink SQL> show tables;
> dimension_table
> dimension_table1
> test
>
>
> Flink SQL> select * from test;
> [ERROR] Could not execute SQL statement. Reason:
> org.apache.flink.connectors.hive.FlinkHiveException: Failed to collect all partitions from hive metaStore
>
>
> [Full log: /opt/cloudera/parcels/FLINK-1.12.1-BIN-SCALA_2.11/lib/flink/log/flink-root-sql-client-cdh6.com.log]
>
> 2021-03-07 10:29:18.776 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Trying to connect to localhost/127.0.0.1:6123
> 2021-03-07 10:29:18.777 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address 'cdh6.com/192.168.31.10': Connection refused (Connection refused)
> 2021-03-07 10:29:18.778 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/127.0.0.1': Connection refused (Connection refused)
> 2021-03-07 10:29:18.778 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/fe80:0:0:0:20c:29ff:fea1:6d6b%ens33': Network is unreachable (connect failed)
> 2021-03-07 10:29:18.778 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/192.168.31.10': Connection refused (Connection refused)
> 2021-03-07 10:29:18.779 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/0:0:0:0:0:0:0:1%lo': Network is unreachable (connect failed)
> 2021-03-07 10:29:18.779 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/127.0.0.1': Connection refused (Connection refused)
> 2021-03-07 10:29:18.779 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/fe80:0:0:0:20c:29ff:fea1:6d6b%ens33': Network is unreachable (connect failed)
> 2021-03-07 10:29:18.779 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/192.168.31.10': Connection refused (Connection refused)
> 2021-03-07 10:29:18.780 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/0:0:0:0:0:0:0:1%lo': Network is unreachable (connect failed)
> 2021-03-07 10:29:18.780 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/127.0.0.1': Connection refused (Connection refused)
> 2021-03-07 10:29:18.780 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Could not connect. Waiting for 1600 msecs before next attempt
> 2021-03-07 10:29:20.381 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Trying to connect to localhost/127.0.0.1:6123
> 2021-03-07 10:29:20.381 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address 'cdh6.com/192.168.31.10': Connection refused (Connection refused)
> 2021-03-07 10:29:20.382 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/127.0.0.1': Connection refused (Connection refused)
> 2021-03-07 10:29:20.383 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/fe80:0:0:0:20c:29ff:fea1:6d6b%ens33': Network is unreachable (connect failed)
> 2021-03-07 10:29:20.383 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/192.168.31.10': Connection refused (Connection refused)
> 2021-03-07 10:29:20.383 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/0:0:0:0:0:0:0:1%lo': Network is unreachable (connect failed)
> 2021-03-07 10:29:20.383 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/127.0.0.1': Connection refused (Connection refused)
> 2021-03-07 10:29:20.384 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/fe80:0:0:0:20c:29ff:fea1:6d6b%ens33': Network is unreachable (connect failed)
> 2021-03-07 10:29:20.384 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/192.168.31.10': Connection refused (Connection refused)
> 2021-03-07 10:29:20.384 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/0:0:0:0:0:0:0:1%lo': Network is unreachable (connect failed)
> 2021-03-07 10:29:20.385 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/127.0.0.1': Connection refused (Connection refused)
> 2021-03-07 10:29:20.385 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Could not connect. Waiting for 1829 msecs before next attempt
> 2021-03-07 10:29:22.214 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Trying to connect to localhost/127.0.0.1:6123
> 2021-03-07 10:29:22.215 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address 'cdh6.com/192.168.31.10': Connection refused (Connection refused)
> 2021-03-07 10:29:22.216 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/127.0.0.1': Connection refused (Connection refused)
> 2021-03-07 10:29:22.216 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/fe80:0:0:0:20c:29ff:fea1:6d6b%ens33': Network is unreachable (connect failed)
> 2021-03-07 10:29:22.217 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/192.168.31.10': Connection refused (Connection refused)
> 2021-03-07 10:29:22.217 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/0:0:0:0:0:0:0:1%lo': Network is unreachable (connect failed)
> 2021-03-07 10:29:22.217 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/127.0.0.1': Connection refused (Connection refused)
> 2021-03-07 10:29:22.218 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/fe80:0:0:0:20c:29ff:fea1:6d6b%ens33': Network is unreachable (connect failed)
> 2021-03-07 10:29:22.218 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/192.168.31.10': Connection refused (Connection refused)
> 2021-03-07 10:29:22.218 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/0:0:0:0:0:0:0:1%lo': Network is unreachable (connect failed)
> 2021-03-07 10:29:22.218 [main] INFO  org.apache.flink.runtime.net.ConnectionUtils - Failed to connect from address '/127.0.0.1': Connection refused (Connection refused)
> 2021-03-07 10:29:22.219 [main] WARN  org.apache.flink.runtime.net.ConnectionUtils - Could not connect to localhost/127.0.0.1:6123. Selecting a local address using heuristics.
> 2021-03-07 10:29:22.290 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.vectorized.use.checked.expressions does not exist
> 2021-03-07 10:29:22.290 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.vectorized.use.checked.expressions does not exist
> 2021-03-07 10:29:22.290 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.strict.checks.no.partition.filter does not exist
> 2021-03-07 10:29:22.290 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.strict.checks.no.partition.filter does not exist
> 2021-03-07 10:29:22.290 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.strict.checks.orderby.no.limit does not exist
> 2021-03-07 10:29:22.290 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.strict.checks.orderby.no.limit does not exist
> 2021-03-07 10:29:22.291 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.vectorized.input.format.excludes does not exist
> 2021-03-07 10:29:22.291 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.vectorized.input.format.excludes does not exist
> 2021-03-07 10:29:22.291 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.strict.checks.bucketing does not exist
> 2021-03-07 10:29:22.291 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.strict.checks.bucketing does not exist
> 2021-03-07 10:29:22.291 [main] INFO  hive.metastore - Trying to connect to metastore with URI thrift://cdh6.com:9083
> 2021-03-07 10:29:22.292 [main] INFO  hive.metastore - Opened a connection to metastore, current connections: 2
> 2021-03-07 10:29:22.301 [main] WARN  hive.metastore - set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
> org.apache.thrift.transport.TTransportException: null
>         at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
>         at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
>         at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380)
>         at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230)
>         at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
>         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:4122)
>         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:4108)
>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:495)
>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:286)
>         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:211)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>         at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1652)
>         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83)
>         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133)
>         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:118)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.flink.table.catalog.hive.client.HiveShimV200.getHiveMetastoreClient(HiveShimV200.java:54)
>         at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:274)
>         at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:80)
>         at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:32)
>         at org.apache.flink.connectors.hive.util.HivePartitionUtils.getAllPartitions(HivePartitionUtils.java:114)
>         at org.apache.flink.connectors.hive.HiveTableSource.getDataStream(HiveTableSource.java:137)
>         at org.apache.flink.connectors.hive.HiveTableSource$1.produceDataStream(HiveTableSource.java:122)
>         at org.apache.flink.table.planner.plan.nodes.common.CommonPhysicalTableSourceScan.createSourceTransformation(CommonPhysicalTableSourceScan.scala:88)
>         at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecTableSourceScan.translateToPlanInternal(StreamExecTableSourceScan.scala:91)
>         at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecTableSourceScan.translateToPlanInternal(StreamExecTableSourceScan.scala:44)
>         at org.apache.flink.table.planner.plan.nodes.exec.ExecNode$class.translateToPlan(ExecNode.scala:59)
>         at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecTableSourceScan.translateToPlan(StreamExecTableSourceScan.scala:44)
>         at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecLegacySink.translateToTransformation(StreamExecLegacySink.scala:158)
>         at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecLegacySink.translateToPlanInternal(StreamExecLegacySink.scala:82)
>         at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecLegacySink.translateToPlanInternal(StreamExecLegacySink.scala:48)
>         at org.apache.flink.table.planner.plan.nodes.exec.ExecNode$class.translateToPlan(ExecNode.scala:59)
>         at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecLegacySink.translateToPlan(StreamExecLegacySink.scala:48)
>         at org.apache.flink.table.planner.delegation.StreamPlanner$$anonfun$translateToPlan$1.apply(StreamPlanner.scala:66)
>         at org.apache.flink.table.planner.delegation.StreamPlanner$$anonfun$translateToPlan$1.apply(StreamPlanner.scala:65)
>         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>         at scala.collection.Iterator$class.foreach(Iterator.scala:891)
>         at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
>         at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>         at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>         at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
>         at scala.collection.AbstractTraversable.map(Traversable.scala:104)
>         at org.apache.flink.table.planner.delegation.StreamPlanner.translateToPlan(StreamPlanner.scala:65)
>         at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:167)
>         at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1329)
>         at org.apache.flink.table.api.internal.TableEnvironmentImpl.translateAndClearBuffer(TableEnvironmentImpl.java:1321)
>         at org.apache.flink.table.api.bridge.java.internal.StreamTableEnvironmentImpl.getPipeline(StreamTableEnvironmentImpl.java:328)
>         at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$createPipeline$1(ExecutionContext.java:287)
>         at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:256)
>         at org.apache.flink.table.client.gateway.local.ExecutionContext.createPipeline(ExecutionContext.java:282)
>         at org.apache.flink.table.client.gateway.local.LocalExecutor.executeQueryInternal(LocalExecutor.java:542)
>         at org.apache.flink.table.client.gateway.local.LocalExecutor.executeQuery(LocalExecutor.java:374)
>         at org.apache.flink.table.client.cli.CliClient.callSelect(CliClient.java:648)
>         at org.apache.flink.table.client.cli.CliClient.callCommand(CliClient.java:323)
>         at java.util.Optional.ifPresent(Optional.java:159)
>         at org.apache.flink.table.client.cli.CliClient.open(CliClient.java:214)
>         at org.apache.flink.table.client.SqlClient.openCli(SqlClient.java:144)
>         at org.apache.flink.table.client.SqlClient.start(SqlClient.java:115)
>         at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
> 2021-03-07 10:29:22.302 [main] INFO  hive.metastore - Connected to metastore.
>
>
>
> 861277329@qq.com
>


-- 
Best regards!
Rui Li

Re: [flink sql-client reading and writing Kerberos-secured Hive]

Posted by Rui Li <li...@gmail.com>.
Then this is probably related to https://issues.apache.org/jira/browse/FLINK-20913
 — that issue was fixed in 1.12.2, so you could try upgrading.

On Mon, Mar 8, 2021 at 2:15 PM guoyb <86...@qq.com> wrote:

> Hello!
> hive.metastore.sasl.enabled is true.
>
>
> When starting the SQL client, the authentication info is picked up correctly, and the table names can be read from the metastore.
>
>
> But actual reads and writes fail authentication.
>
>
>
> ---原始邮件---
> 发件人: "Rui Li"<lirui.fudan@gmail.com&gt;
> 发送时间: 2021年3月8日(周一) 中午12:12
> 收件人: "user-zh"<user-zh@flink.apache.org&gt;;
> 主题: Re: 【flink sql-client 读写 Kerberos认证的hive】
>
>
> Hi,
>
>
> 从你发的stacktrace来看,走到了set_ugi方法说明client认为server没有开启kerberos。确认一下你HiveCatalog这边指定的hive-site.xml是否配置正确呢,像hive.metastore.sasl.enabled是不是设置成true了?
>
> On Sun, Mar 7, 2021 at 5:49 PM 861277329@qq.com <861277329@qq.com&gt;
> wrote:
>
> &gt; 环境:
> &gt; flink1.12.1&amp;nbsp;
> &gt; hive2.1.0
> &gt; CDH6.2.0
> &gt;
> &gt;
> &gt; 【问题描述】
> &gt; &amp;nbsp;在没开启Kerberos认证时,可以正常读写hive表
> &gt; &amp;nbsp;
> &gt; &amp;nbsp;开启Kerberos认证后,
> &gt; &amp;nbsp;启动时可以正常读取到hive metastore的元数据信息,读写不了表。
> &gt;
> &gt;
> &gt; 【sql-client.sh embedded】
> &gt; Flink SQL&amp;gt; show tables;
> &gt; dimension_table
> &gt; dimension_table1
> &gt; test
> &gt;
> &gt;
> &gt; Flink SQL&amp;gt; select * from test;
> &gt; [ERROR] Could not execute SQL statement. Reason:
> &gt; org.apache.flink.connectors.hive.FlinkHiveException: Failed to
> collect all
> &gt; partitions from hive metaStore
> &gt;
> &gt;
> &gt; 【完整日志
> &gt;
> /opt/cloudera/parcels/FLINK-1.12.1-BIN-SCALA_2.11/lib/flink/log/flink-root-sql-client-cdh6.com.log】
> &gt;
> &gt; 2021-03-07 10:29:18.776 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Trying to connect to localhost/127.0.0.1:6123
> &gt; 2021-03-07 10:29:18.777 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address 'cdh6.com/192.168.31.10': Connection
> &gt; refused (Connection refused)
> &gt; 2021-03-07 10:29:18.778 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/127.0.0.1': Connection refused
> &gt; (Connection refused)
> &gt; 2021-03-07 10:29:18.778 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address
> '/fe80:0:0:0:20c:29ff:fea1:6d6b%ens33':
> &gt; Network is unreachable (connect failed)
> &gt; 2021-03-07 10:29:18.778 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/192.168.31.10': Connection refused
> &gt; (Connection refused)
> &gt; 2021-03-07 10:29:18.779 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/0:0:0:0:0:0:0:1%lo': Network is
> &gt; unreachable (connect failed)
> &gt; 2021-03-07 10:29:18.779 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/127.0.0.1': Connection refused
> &gt; (Connection refused)
> &gt; 2021-03-07 10:29:18.779 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address
> '/fe80:0:0:0:20c:29ff:fea1:6d6b%ens33':
> &gt; Network is unreachable (connect failed)
> &gt; 2021-03-07 10:29:18.779 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/192.168.31.10': Connection refused
> &gt; (Connection refused)
> &gt; 2021-03-07 10:29:18.780 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/0:0:0:0:0:0:0:1%lo': Network is
> &gt; unreachable (connect failed)
> &gt; 2021-03-07 10:29:18.780 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/127.0.0.1': Connection refused
> &gt; (Connection refused)
> &gt; 2021-03-07 10:29:18.780 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Could not connect. Waiting for 1600 msecs before next attempt
> &gt; 2021-03-07 10:29:20.381 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Trying to connect to localhost/127.0.0.1:6123
> &gt; 2021-03-07 10:29:20.381 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address 'cdh6.com/192.168.31.10': Connection
> &gt; refused (Connection refused)
> &gt; 2021-03-07 10:29:20.382 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/127.0.0.1': Connection refused
> &gt; (Connection refused)
> &gt; 2021-03-07 10:29:20.383 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address
> '/fe80:0:0:0:20c:29ff:fea1:6d6b%ens33':
> &gt; Network is unreachable (connect failed)
> &gt; 2021-03-07 10:29:20.383 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/192.168.31.10': Connection refused
> &gt; (Connection refused)
> &gt; 2021-03-07 10:29:20.383 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/0:0:0:0:0:0:0:1%lo': Network is
> &gt; unreachable (connect failed)
> &gt; 2021-03-07 10:29:20.383 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/127.0.0.1': Connection refused
> &gt; (Connection refused)
> &gt; 2021-03-07 10:29:20.384 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address
> '/fe80:0:0:0:20c:29ff:fea1:6d6b%ens33':
> &gt; Network is unreachable (connect failed)
> &gt; 2021-03-07 10:29:20.384 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/192.168.31.10': Connection refused
> &gt; (Connection refused)
> &gt; 2021-03-07 10:29:20.384 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/0:0:0:0:0:0:0:1%lo': Network is
> &gt; unreachable (connect failed)
> &gt; 2021-03-07 10:29:20.385 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/127.0.0.1': Connection refused
> &gt; (Connection refused)
> &gt; 2021-03-07 10:29:20.385 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Could not connect. Waiting for 1829 msecs before next attempt
> &gt; 2021-03-07 10:29:22.214 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Trying to connect to localhost/127.0.0.1:6123
> &gt; 2021-03-07 10:29:22.215 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address 'cdh6.com/192.168.31.10': Connection
> &gt; refused (Connection refused)
> &gt; 2021-03-07 10:29:22.216 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/127.0.0.1': Connection refused
> &gt; (Connection refused)
> &gt; 2021-03-07 10:29:22.216 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address
> '/fe80:0:0:0:20c:29ff:fea1:6d6b%ens33':
> &gt; Network is unreachable (connect failed)
> &gt; 2021-03-07 10:29:22.217 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/192.168.31.10': Connection refused
> &gt; (Connection refused)
> &gt; 2021-03-07 10:29:22.217 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/0:0:0:0:0:0:0:1%lo': Network is
> &gt; unreachable (connect failed)
> &gt; 2021-03-07 10:29:22.217 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/127.0.0.1': Connection refused
> &gt; (Connection refused)
> &gt; 2021-03-07 10:29:22.218 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address
> '/fe80:0:0:0:20c:29ff:fea1:6d6b%ens33':
> &gt; Network is unreachable (connect failed)
> &gt; 2021-03-07 10:29:22.218 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/192.168.31.10': Connection refused
> &gt; (Connection refused)
> &gt; 2021-03-07 10:29:22.218 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/0:0:0:0:0:0:0:1%lo': Network is
> &gt; unreachable (connect failed)
> &gt; 2021-03-07 10:29:22.218 [main] INFO&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Failed to connect from address '/127.0.0.1': Connection refused
> &gt; (Connection refused)
> &gt; 2021-03-07 10:29:22.219 [main] WARN&amp;nbsp;
> org.apache.flink.runtime.net.ConnectionUtils&amp;nbsp;
> &gt; - Could not connect to localhost/127.0.0.1:6123. Selecting a local
> &gt; address using heuristics.
> > 2021-03-07 10:29:22.290 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.vectorized.use.checked.expressions does not exist
> > 2021-03-07 10:29:22.290 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.vectorized.use.checked.expressions does not exist
> > 2021-03-07 10:29:22.290 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.strict.checks.no.partition.filter does not exist
> > 2021-03-07 10:29:22.290 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.strict.checks.no.partition.filter does not exist
> > 2021-03-07 10:29:22.290 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.strict.checks.orderby.no.limit does not exist
> > 2021-03-07 10:29:22.290 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.strict.checks.orderby.no.limit does not exist
> > 2021-03-07 10:29:22.291 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.vectorized.input.format.excludes does not exist
> > 2021-03-07 10:29:22.291 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.vectorized.input.format.excludes does not exist
> > 2021-03-07 10:29:22.291 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.strict.checks.bucketing does not exist
> > 2021-03-07 10:29:22.291 [main] WARN  org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.strict.checks.bucketing does not exist
> > 2021-03-07 10:29:22.291 [main] INFO  hive.metastore - Trying to connect to metastore with URI thrift://cdh6.com:9083
> > 2021-03-07 10:29:22.292 [main] INFO  hive.metastore - Opened a connection to metastore, current connections: 2
> > 2021-03-07 10:29:22.301 [main] WARN  hive.metastore - set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
> > org.apache.thrift.transport.TTransportException: null
> >         at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
> >         at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
> >         at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380)
> >         at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230)
> >         at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
> >         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:4122)
> >         at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:4108)
> >         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:495)
> >         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:286)
> >         at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:211)
> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> >         at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> >         at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >         at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> >         at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1652)
> >         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83)
> >         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133)
> >         at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:118)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >         at java.lang.reflect.Method.invoke(Method.java:498)
> >         at org.apache.flink.table.catalog.hive.client.HiveShimV200.getHiveMetastoreClient(HiveShimV200.java:54)
> >         at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:274)
> >         at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:80)
> >         at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:32)
> >         at org.apache.flink.connectors.hive.util.HivePartitionUtils.getAllPartitions(HivePartitionUtils.java:114)
> >         at org.apache.flink.connectors.hive.HiveTableSource.getDataStream(HiveTableSource.java:137)
> >         at org.apache.flink.connectors.hive.HiveTableSource$1.produceDataStream(HiveTableSource.java:122)
> >         at org.apache.flink.table.planner.plan.nodes.common.CommonPhysicalTableSourceScan.createSourceTransformation(CommonPhysicalTableSourceScan.scala:88)
> >         at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecTableSourceScan.translateToPlanInternal(StreamExecTableSourceScan.scala:91)
> >         at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecTableSourceScan.translateToPlanInternal(StreamExecTableSourceScan.scala:44)
> >         at org.apache.flink.table.planner.plan.nodes.exec.ExecNode$class.translateToPlan(ExecNode.scala:59)
> >         at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecTableSourceScan.translateToPlan(StreamExecTableSourceScan.scala:44)
> >         at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecLegacySink.translateToTransformation(StreamExecLegacySink.scala:158)
> >         at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecLegacySink.translateToPlanInternal(StreamExecLegacySink.scala:82)
> >         at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecLegacySink.translateToPlanInternal(StreamExecLegacySink.scala:48)
> >         at org.apache.flink.table.planner.plan.nodes.exec.ExecNode$class.translateToPlan(ExecNode.scala:59)
> >         at org.apache.flink.table.planner.plan.nodes.physical.stream.StreamExecLegacySink.translateToPlan(StreamExecLegacySink.scala:48)
> >         at org.apache.flink.table.planner.delegation.StreamPlanner$$anonfun$translateToPlan$1.apply(StreamPlanner.scala:66)
> >         at org.apache.flink.table.planner.delegation.StreamPlanner$$anonfun$translateToPlan$1.apply(StreamPlanner.scala:65)
> >         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
> >         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
> >         at scala.collection.Iterator$class.foreach(Iterator.scala:891)
> >         at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
> >         at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
> >         at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
> >         at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
> >         at scala.collection.AbstractTraversable.map(Traversable.scala:104)
> >         at org.apache.flink.table.planner.delegation.StreamPlanner.translateToPlan(StreamPlanner.scala:65)
> >         at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:167)
> >         at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1329)
> >         at org.apache.flink.table.api.internal.TableEnvironmentImpl.translateAndClearBuffer(TableEnvironmentImpl.java:1321)
> >         at org.apache.flink.table.api.bridge.java.internal.StreamTableEnvironmentImpl.getPipeline(StreamTableEnvironmentImpl.java:328)
> >         at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$createPipeline$1(ExecutionContext.java:287)
> >         at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:256)
> >         at org.apache.flink.table.client.gateway.local.ExecutionContext.createPipeline(ExecutionContext.java:282)
> >         at org.apache.flink.table.client.gateway.local.LocalExecutor.executeQueryInternal(LocalExecutor.java:542)
> >         at org.apache.flink.table.client.gateway.local.LocalExecutor.executeQuery(LocalExecutor.java:374)
> >         at org.apache.flink.table.client.cli.CliClient.callSelect(CliClient.java:648)
> >         at org.apache.flink.table.client.cli.CliClient.callCommand(CliClient.java:323)
> >         at java.util.Optional.ifPresent(Optional.java:159)
> >         at org.apache.flink.table.client.cli.CliClient.open(CliClient.java:214)
> >         at org.apache.flink.table.client.SqlClient.openCli(SqlClient.java:144)
> >         at org.apache.flink.table.client.SqlClient.start(SqlClient.java:115)
> >         at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
> > 2021-03-07 10:29:22.302 [main] INFO  hive.metastore - Connected to metastore.
> >
> >
> > 861277329@qq.com
> >
>
>
> --
> Best regards!
> Rui Li
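
For reference, when the metastore is kerberized, the hive-site.xml that the HiveCatalog reads needs the SASL settings below on the client side; this is only an illustrative sketch, and the principal, realm, and host values are placeholders to adapt to your cluster:

```xml
<!-- Illustrative hive-site.xml fragment for a Kerberos-secured metastore.
     Principal/realm/host values below are placeholders, not taken from this thread. -->
<property>
  <!-- Must be true, or the client falls back to set_ugi() as seen in the trace above -->
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value>
</property>
<property>
  <!-- Service principal of the metastore; _HOST is expanded to the metastore host -->
  <name>hive.metastore.kerberos.principal</name>
  <value>hive/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://cdh6.com:9083</value>
</property>
```

In addition, the Flink processes themselves typically need `security.kerberos.login.keytab` and `security.kerberos.login.principal` set in flink-conf.yaml so that the obtained ticket is available when the connector opens the metastore connection.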



-- 
Best regards!
Rui Li