Posted to issues@spark.apache.org by "liuxiuyuan (Jira)" <ji...@apache.org> on 2020/03/09 03:57:00 UTC

[jira] [Updated] (SPARK-31084) spark on k8s Exception "Database xxx not found" when hive MetaStoreClient lost connection

     [ https://issues.apache.org/jira/browse/SPARK-31084?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

liuxiuyuan updated SPARK-31084:
-------------------------------
    Description: 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke:184 | MetaStoreClient lost connection. Attempting to reconnect.
org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
06-03-2020 12:56:09 CST stage1 INFO - Caused by: java.net.SocketException: Connection reset
06-03-2020 12:56:09 CST stage1 INFO - 	at java.net.SocketInputStream.read(SocketInputStream.java:210)
06-03-2020 12:56:09 CST stage1 INFO - 	at java.net.SocketInputStream.read(SocketInputStream.java:141)
06-03-2020 12:56:09 CST stage1 INFO - 	at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
06-03-2020 12:56:09 CST stage1 INFO - 	at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
06-03-2020 12:56:09 CST stage1 INFO - 	at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
06-03-2020 12:56:09 CST stage1 INFO - 	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:127)
06-03-2020 12:56:09 CST stage1 INFO - 	... 70 more
06-03-2020 12:56:09 CST stage1 INFO - 2020-03-06 04:56:06 [INFO] -- org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open:376 | Trying to connect to metastore with URI thrift://hive-metastore-server:9083
06-03-2020 12:56:09 CST stage1 INFO - 2020-03-06 04:56:06 [INFO] -- org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open:472 | Connected to metastore.
06-03-2020 12:56:09 CST stage1 INFO - 2020-03-06 04:56:14 [WARN] -- org.apache.spark.internal.Logging$class.logWarning:87 | Kubernetes client has been closed (this is expected if the application is shutting down.)
06-03-2020 12:56:09 CST stage1 INFO - 2020-03-06 04:56:14 [WARN] -- org.apache.spark.internal.Logging$class.logWarning:87 | Kubernetes client has been closed (this is expected if the application is shutting down.)
06-03-2020 12:56:09 CST stage1 INFO - Exception in thread "main" org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'xxx' not found;
06-03-2020 12:56:09 CST stage1 INFO - 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.org$apache$spark$sql$catalyst$catalog$SessionCatalog$$requireDbExists(SessionCatalog.scala:178)
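The log shows the metastore client losing its Thrift connection, reconnecting successfully, and Spark nevertheless surfacing the transient failure as NoSuchDatabaseException from the databaseExists check. One possible client-side mitigation, until the root cause is addressed, is to retry the failing write from the application. The sketch below is a hypothetical, generic retry helper (TransientRetry and its parameters are illustrative names, not part of Spark or the reporter's code); in the reporter's application it would wrap the saveAsTable call made from HiveOperator.saveToStageSchema. Whether a retry at this level is safe depends on the write being idempotent.

```java
import java.util.function.Supplier;

// Hypothetical retry helper: re-runs an action that may fail transiently
// (e.g. a DataFrameWriter.saveAsTable that hits "Database not found" right
// after a metastore reconnect), with a fixed delay between attempts.
final class TransientRetry {

    static <T> T withRetry(int maxAttempts, long delayMillis, Supplier<T> action) {
        RuntimeException last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return action.get();
            } catch (RuntimeException e) {
                last = e; // remember the most recent failure
                if (attempt < maxAttempts) {
                    try {
                        Thread.sleep(delayMillis); // back off before retrying
                    } catch (InterruptedException ie) {
                        Thread.currentThread().interrupt();
                        throw last;
                    }
                }
            }
        }
        throw last; // all attempts failed; rethrow the last exception
    }
}
```

For example, `TransientRetry.withRetry(3, 5000L, () -> { df.write().saveAsTable("xxx.t"); return null; })` would give the metastore client time to reconnect before the job is failed (df and the table name here are placeholders).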

  was:
 
2020-03-06 04:39:00 [INFO] -- org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.<init>:108 | File Output Committer Algorithm version is 1
2020-03-06 04:56:05 [WARN] -- org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke:184 | MetaStoreClient lost connection. Attempting to reconnect.
org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:129)
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
	at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
	at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:654)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:641)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1158)
	at sun.reflect.GeneratedMethodAccessor100.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
	at com.sun.proxy.$Proxy38.getDatabase(Unknown Source)
	at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1301)
	at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1290)
	at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$databaseExists$1.apply$mcZ$sp(HiveClientImpl.scala:349)
	at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$databaseExists$1.apply(HiveClientImpl.scala:349)
	at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$databaseExists$1.apply(HiveClientImpl.scala:349)
	at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:275)
	at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:213)
	at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:212)
	at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:258)
	at org.apache.spark.sql.hive.client.HiveClientImpl.databaseExists(HiveClientImpl.scala:348)
	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215)
	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
	at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.databaseExists(ExternalCatalogWithListener.scala:69)
	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.databaseExists(SessionCatalog.scala:243)
	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.org$apache$spark$sql$catalyst$catalog$SessionCatalog$$requireDbExists(SessionCatalog.scala:177)
	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createTable(SessionCatalog.scala:316)
	at org.apache.spark.sql.execution.command.CreateDataSourceTableAsSelectCommand.run(createDataSourceTables.scala:185)
	at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104)
	at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102)
	at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:122)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
	at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
	at org.apache.spark.sql.DataFrameWriter.createTable(DataFrameWriter.scala:474)
	at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:453)
	at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:409)
	at com.yitutech.data.HiveOperator.saveToStageSchema(HiveOperator.java:107)
	at com.yitutech.handler.SaveStageHandler.lambda$handle$0(SaveStageHandler.java:42)
	at java.util.HashMap.forEach(HashMap.java:1289)
	at com.yitutech.handler.SaveStageHandler.handle(SaveStageHandler.java:36)
	at com.yitutech.AppMain.main(AppMain.java:60)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.SocketException: Connection reset
	at java.net.SocketInputStream.read(SocketInputStream.java:210)
	at java.net.SocketInputStream.read(SocketInputStream.java:141)
	at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
	at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
	at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:127)
	... 70 more
2020-03-06 04:56:06 [INFO] -- org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open:376 | Trying to connect to metastore with URI thrift://hive-metastore-server:9083
2020-03-06 04:56:06 [INFO] -- org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open:472 | Connected to metastore.
2020-03-06 04:56:14 [WARN] -- org.apache.spark.internal.Logging$class.logWarning:87 | Kubernetes client has been closed (this is expected if the application is shutting down.)
2020-03-06 04:56:14 [WARN] -- org.apache.spark.internal.Logging$class.logWarning:87 | Kubernetes client has been closed (this is expected if the application is shutting down.)
2020-03-06 04:56:14 [WARN] -- org.apache.spark.internal.Logging$class.logWarning:87 | Kubernetes client has been closed (this is expected if the application is shutting down.)
Exception in thread "main" org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'xxxx' not found;
	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.org$apache$spark$sql$catalyst$catalog$SessionCatalog$$requireDbExists(SessionCatalog.scala:178)
	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createTable(SessionCatalog.scala:316)
	at org.apache.spark.sql.execution.command.CreateDataSourceTableAsSelectCommand.run(createDataSourceTables.scala:185)
	at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104)
	at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102)
	at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:122)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
	at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
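The trace shows the failure originating in RetryingMetaStoreClient, whose retry behavior is configurable. A possible server-connection-side mitigation is to raise the Hive metastore client retry settings so that a single connection reset is absorbed before Spark's databaseExists check fails. The snippet below is an illustrative hive-site.xml fragment; the property names are standard Hive client settings, but the values shown are examples only and effective behavior can vary by Hive client version bundled with Spark.

```xml
<!-- Illustrative hive-site.xml fragment (example values, not a verified fix) -->
<property>
  <name>hive.metastore.connect.retries</name>
  <value>10</value>
  <description>Attempts to establish a metastore connection.</description>
</property>
<property>
  <name>hive.metastore.failure.retries</name>
  <value>5</value>
  <description>Retries of a failed metastore call (used by RetryingMetaStoreClient).</description>
</property>
<property>
  <name>hive.metastore.client.connect.retry.delay</name>
  <value>5s</value>
  <description>Delay between client reconnect attempts.</description>
</property>
```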
 


> spark on k8s Exception "Database xxx not found" when hive MetaStoreClient lost connection
> -----------------------------------------------------------------------------------------
>
>                 Key: SPARK-31084
>                 URL: https://issues.apache.org/jira/browse/SPARK-31084
>             Project: Spark
>          Issue Type: Question
>          Components: Kubernetes
>    Affects Versions: 2.4.4
>         Environment: spark 2.4.4
>            Reporter: liuxiuyuan
>            Priority: Major
>
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke:184 | MetaStoreClient lost connection. Attempting to reconnect.
> org.apache.thrift.transport.TTransportException: java.net.SocketException: Connection reset
> 06-03-2020 12:56:09 CST stage1 INFO - Caused by: java.net.SocketException: Connection reset
> 06-03-2020 12:56:09 CST stage1 INFO - 	at java.net.SocketInputStream.read(SocketInputStream.java:210)
> 06-03-2020 12:56:09 CST stage1 INFO - 	at java.net.SocketInputStream.read(SocketInputStream.java:141)
> 06-03-2020 12:56:09 CST stage1 INFO - 	at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> 06-03-2020 12:56:09 CST stage1 INFO - 	at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
> 06-03-2020 12:56:09 CST stage1 INFO - 	at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
> 06-03-2020 12:56:09 CST stage1 INFO - 	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:127)
> 06-03-2020 12:56:09 CST stage1 INFO - 	... 70 more
> 06-03-2020 12:56:09 CST stage1 INFO - 2020-03-06 04:56:06 [INFO] -- org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open:376 | Trying to connect to metastore with URI thrift://hive-metastore-server:9083
> 06-03-2020 12:56:09 CST stage1 INFO - 2020-03-06 04:56:06 [INFO] -- org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open:472 | Connected to metastore.
> 06-03-2020 12:56:09 CST stage1 INFO - 2020-03-06 04:56:14 [WARN] -- org.apache.spark.internal.Logging$class.logWarning:87 | Kubernetes client has been closed (this is expected if the application is shutting down.)
> 06-03-2020 12:56:09 CST stage1 INFO - 2020-03-06 04:56:14 [WARN] -- org.apache.spark.internal.Logging$class.logWarning:87 | Kubernetes client has been closed (this is expected if the application is shutting down.)
> 06-03-2020 12:56:09 CST stage1 INFO - Exception in thread "main" org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'xxx' not found;
> 06-03-2020 12:56:09 CST stage1 INFO - 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.org$apache$spark$sql$catalyst$catalog$SessionCatalog$$requireDbExists(SessionCatalog.scala:178)



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org