Posted to user@kylin.apache.org by 周浪 <75...@qq.com> on 2019/11/30 12:16:33 UTC

When building a cube in Kylin, step 3 fails to connect to the hive-metastore with the error [transport.TIOStreamTransport:112 : Error closing output stream. java.net.SocketException: Socket closed]

kylin version: apache-kylin-3.0.0-beta-bin-hadoop3
hive version: Apache Hive (version 3.1.0-mrs-2.0)
Beeline version: 3.1.0-mrs-2.0


When building the cube, the third step fails. The logs show that Kylin tries to connect to the Hive metastore, but right after the connection is opened the metastore server reports [org.apache.thrift.transport.TTransportException: Invalid status -128], which causes the Kylin client to drop the connection, so the cube build cannot proceed.


Notes:
1. Running Hive SQL directly through beeline works fine.
2. In step 1 and step 2, connecting to HiveServer2 works, and the MR jobs run normally.
3. The cluster uses Kerberos authentication.


Has anyone run into a similar situation?
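For reference, a server-side "Invalid status -128" during SASL negotiation usually means the client opened a plain (non-SASL) Thrift connection against a SASL-enabled endpoint, which matches the failed set_ugi() call in the Kylin log (set_ugi is only used on non-secure connections). One thing worth checking is whether the hive-site.xml that Kylin actually picks up enables SASL for the metastore and points at the metastore port. A minimal sketch of the relevant properties (property names are standard Hive configuration keys; the host and realm values below are placeholders, not taken from this cluster):

```xml
<!-- hive-site.xml as seen by Kylin; values are placeholders -->
<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value> <!-- required so the client does the Kerberos/SASL handshake -->
</property>
<property>
  <name>hive.metastore.kerberos.principal</name>
  <value>hive/_HOST@EXAMPLE.COM</value>
</property>
<property>
  <name>hive.metastore.uris</name>
  <!-- the metastore Thrift service usually listens on 9083;
       10000 is the HiveServer2 default, so a URI on 10000 may
       actually be pointing at HiveServer2 instead -->
  <value>thrift://metastore-host:9083</value>
</property>
```

Note that the log above shows hive.metastore.uris resolving to port 10000, so it may also be worth confirming which service is really listening on that port.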




1) Step screenshot (image not included in this plain-text version)

2) Kylin error log:


2019-11-29 15:21:22,761 INFO [Scheduler 824343166 Job b9b72cd3-3e94-dfdf-a49d-66c96be4a222-111] common.KylinConfig:461 : Creating new manager instance of class org.apache.kylin.source.SourceManager
2019-11-29 15:21:22,851 INFO [Scheduler 824343166 Job b9b72cd3-3e94-dfdf-a49d-66c96be4a222-111] conf.HiveConf:187 : Found configuration file file:/home/fibotestadmin/config/hive-site.xml
2019-11-29 15:21:22,941 INFO [Scheduler 824343166 Job b9b72cd3-3e94-dfdf-a49d-66c96be4a222-111] common.HCatUtil:652 : mapreduce.lib.hcatoutput.hive.conf not set. Generating configuration differences.
2019-11-29 15:21:22,942 INFO [Scheduler 824343166 Job b9b72cd3-3e94-dfdf-a49d-66c96be4a222-111] common.HCatUtil:638 : Configuration differences={hive.server2.authentication=KERBEROS, hive.server2.authentication.kerberos.principal=hive/hadoop.0b019988_a1b0_4f31_ae11_6299a85f88ff.com@0B019988_A1B0_4F31_AE11_6299A85F88FF.COM, hive.server2.thrift.sasl.qop=auth-conf, hive.server2.authentication.kerberos.keytab=/opt/Bigdata/MRS_2.0.5/install/FusionInsight-Hive-3.1.0/hive-3.1.0/bin/hive.keytab, hive.mapred.reduce.tasks.speculative.execution=false, hive.security.authorization.enabled=true, hive.metastore.kerberos.principal=hive/hadoop.0b019988_a1b0_4f31_ae11_6299a85f88ff.com@0B019988_A1B0_4F31_AE11_6299A85F88FF.COM, hive.metastore.uris=thrift://192.168.0.224:10000,thrift://192.168.0.61:10000}
2019-11-29 15:21:22,950 INFO [Scheduler 824343166 Job b9b72cd3-3e94-dfdf-a49d-66c96be4a222-111] common.HiveClientCache:119 : Initializing cache: eviction-timeout=120 initial-capacity=50 maximum-capacity=50
2019-11-29 15:21:23,014 INFO [Scheduler 824343166 Job b9b72cd3-3e94-dfdf-a49d-66c96be4a222-111] metastore.HiveMetaStoreClient:526 : Trying to connect to metastore with URI thrift://node-master1BItD:10000
2019-11-29 15:21:23,037 INFO [Scheduler 824343166 Job b9b72cd3-3e94-dfdf-a49d-66c96be4a222-111] metastore.HiveMetaStoreClient:602 : Opened a connection to metastore, current connections: 1
2019-11-29 15:21:23,076 WARN [Scheduler 824343166 Job b9b72cd3-3e94-dfdf-a49d-66c96be4a222-111] metastore.HiveMetaStoreClient:645 : set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
org.apache.thrift.transport.TTransportException
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
	at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380)
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230)
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:4814)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:4800)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:637)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:228)
	at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.<init>(HiveClientCache.java:409)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:84)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:95)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133)
	at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:297)
	at org.apache.hive.hcatalog.common.HiveClientCache$5.call(HiveClientCache.java:292)
	at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4767)
	at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3568)
	at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2350)
	at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2313)
	at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2228)
	at com.google.common.cache.LocalCache.get(LocalCache.java:3965)
	at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4764)
	at org.apache.hive.hcatalog.common.HiveClientCache.getOrCreate(HiveClientCache.java:292)
	at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:267)
	at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:569)
	at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:104)
	at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:88)
	at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
	at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
	at org.apache.kylin.source.hive.HiveMRInput$HiveTableInputFormat.configureJob(HiveMRInput.java:80)
	at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.setupMapper(FactDistinctColumnsJob.java:126)
	at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:104)
	at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:131)
	at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:179)
	at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:71)
	at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:179)
	at org.apache.kylin.job.impl.threadpool.DistributedScheduler$JobRunner.run(DistributedScheduler.java:110)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
2019-11-29 15:21:23,078 INFO [Scheduler 824343166 Job b9b72cd3-3e94-dfdf-a49d-66c96be4a222-111] metastore.HiveMetaStoreClient:673 : Connected to metastore.
2019-11-29 15:21:23,078 INFO [Scheduler 824343166 Job b9b72cd3-3e94-dfdf-a49d-66c96be4a222-111] metastore.RetryingMetaStoreClient:97 : RetryingMetaStoreClient proxy=class org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient ugi=kylin@0B019988_A1B0_4F31_AE11_6299A85F88FF.COM (auth:KERBEROS) retries=1 delay=1 lifetime=0
2019-11-29 15:21:23,245 WARN [Scheduler 824343166 Job b9b72cd3-3e94-dfdf-a49d-66c96be4a222-111] metastore.HiveMetaStoreClient:445 : Evicted client has non-zero user count: 1
2019-11-29 15:21:23,245 WARN [Scheduler 824343166 Job b9b72cd3-3e94-dfdf-a49d-66c96be4a222-111] metastore.HiveMetaStoreClient:496 : Non-zero user count preventing client tear down: users=1 expired=true
2019-11-29 15:21:23,248 WARN [Scheduler 824343166 Job b9b72cd3-3e94-dfdf-a49d-66c96be4a222-111] transport.TIOStreamTransport:112 : Error closing output stream.
java.net.SocketException: Socket closed
	at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:118)
	at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.FilterOutputStream.close(FilterOutputStream.java:158)
	at org.apache.thrift.transport.TIOStreamTransport.close(TIOStreamTransport.java:110)
	at org.apache.thrift.transport.TSocket.close(TSocket.java:235)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.close(HiveMetaStoreClient.java:703)
	at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.tearDown(HiveClientCache.java:510)
	at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.tearDownIfUnused(HiveClientCache.java:500)
	at org.apache.hive.hcatalog.common.HiveClientCache$CacheableHiveMetaStoreClient.close(HiveClientCache.java:485)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:208)
	at com.sun.proxy.$Proxy69.close(Unknown Source)
	at org.apache.hive.hcatalog.common.HiveClientCache.get(HiveClientCache.java:273)
	at org.apache.hive.hcatalog.common.HCatUtil.getHiveMetastoreClient(HCatUtil.java:569)
	at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:104)
	at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:88)
	at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
	at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
	at org.apache.kylin.source.hive.HiveMRInput$HiveTableInputFormat.configureJob(HiveMRInput.java:80)
	at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.setupMapper(FactDistinctColumnsJob.java:126)
	at org.apache.kylin.engine.mr.steps.FactDistinctColumnsJob.run(FactDistinctColumnsJob.java:104)
	at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:131)
	at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:179)
	at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:71)
	at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:179)
	at org.apache.kylin.job.impl.threadpool.DistributedScheduler$JobRunner.run(DistributedScheduler.java:110)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
2019-11-29 15:21:23,249 INFO [Scheduler 824343166 Job b9b72cd3-3e94-dfdf-a49d-66c96be4a222-111] metastore.HiveMetaStoreClient:704 : Closed a connection to metastore, current connections: 0





3) hive-metastore error log:


2019-11-29 15:21:23,064 | ERROR | HiveServer2-Handler-Pool: Thread-173 | Error occurred during processing of message. | 
java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Invalid status -128
	at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:219) ~[hive-exec-3.1.0-mrs-2.0.jar:3.1.0-mrs-2.0]
	at org.apache.hadoop.hive.metastore.security.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:694) ~[hive-exec-3.1.0-mrs-2.0.jar:3.1.0-mrs-2.0]
	at org.apache.hadoop.hive.metastore.security.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:691) ~[hive-exec-3.1.0-mrs-2.0.jar:3.1.0-mrs-2.0]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_212]
	at javax.security.auth.Subject.doAs(Subject.java:360) ~[?:1.8.0_212]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709) ~[hadoop-common-3.1.1-mrs-2.0.jar:?]
	at org.apache.hadoop.hive.metastore.security.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory.getTransport(HadoopThriftAuthBridge.java:691) ~[hive-exec-3.1.0-mrs-2.0.jar:3.1.0-mrs-2.0]
	at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:269) ~[hive-exec-3.1.0-mrs-2.0.jar:3.1.0-mrs-2.0]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_212]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_212]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_212]
Caused by: org.apache.thrift.transport.TTransportException: Invalid status -128
	at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:232) ~[hive-exec-3.1.0-mrs-2.0.jar:3.1.0-mrs-2.0]
	at org.apache.thrift.transport.TSaslTransport.receiveSaslMessage(TSaslTransport.java:184) ~[hive-exec-3.1.0-mrs-2.0.jar:3.1.0-mrs-2.0]
	at org.apache.thrift.transport.TSaslServerTransport.handleSaslStartMessage(TSaslServerTransport.java:125) ~[hive-exec-3.1.0-mrs-2.0.jar:3.1.0-mrs-2.0]
	at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) ~[hive-exec-3.1.0-mrs-2.0.jar:3.1.0-mrs-2.0]
	at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41) ~[hive-exec-3.1.0-mrs-2.0.jar:3.1.0-mrs-2.0]
	at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216) ~[hive-exec-3.1.0-mrs-2.0.jar:3.1.0-mrs-2.0]
	... 10 more
2019-11-29 15:21:24,282 | ERROR | HiveServer2-Handler-Pool: Thread-173 | Error occurred during processing of message. | 
java.lang.RuntimeException: org.apache.thrift.transport.TTransportException: Invalid status -128
	(stack trace identical to the first occurrence above)