Posted to issues@spark.apache.org by "tone (JIRA)" <ji...@apache.org> on 2016/08/31 10:18:20 UTC

[jira] [Created] (SPARK-17330) Fix the failing Spark UT case (SPARK-8368)

tone created SPARK-17330:
----------------------------

             Summary: Fix the failing Spark UT case (SPARK-8368)
                 Key: SPARK-17330
                 URL: https://issues.apache.org/jira/browse/SPARK-17330
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.1.0
            Reporter: tone


When running the Spark unit tests against the latest master branch, the UT case (SPARK-8368) passes on the first run but always fails when run again. The error log is as follows:
[info]   2016-08-31 09:35:51.967 - stderr> 16/08/31 09:35:51 ERROR RetryingHMSHandler: AlreadyExistsException(message:Database default already exists)
[info]   2016-08-31 09:35:51.967 - stderr>      at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_database(HiveMetaStore.java:891)
[info]   2016-08-31 09:35:51.967 - stderr>      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[info]   2016-08-31 09:35:51.967 - stderr>      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[info]   2016-08-31 09:35:51.967 - stderr>      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[info]   2016-08-31 09:35:51.967 - stderr>      at java.lang.reflect.Method.invoke(Method.java:498)
[info]   2016-08-31 09:35:51.967 - stderr>      at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
[info]   2016-08-31 09:35:51.967 - stderr>      at com.sun.proxy.$Proxy18.create_database(Unknown Source)
[info]   2016-08-31 09:35:51.967 - stderr>      at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createDatabase(HiveMetaStoreClient.java:644)
[info]   2016-08-31 09:35:51.967 - stderr>      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[info]   2016-08-31 09:35:51.967 - stderr>      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[info]   2016-08-31 09:35:51.967 - stderr>      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[info]   2016-08-31 09:35:51.967 - stderr>      at java.lang.reflect.Method.invoke(Method.java:498)
[info]   2016-08-31 09:35:51.967 - stderr>      at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
[info]   2016-08-31 09:35:51.968 - stderr>      at com.sun.proxy.$Proxy19.createDatabase(Unknown Source)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.hadoop.hive.ql.metadata.Hive.createDatabase(Hive.java:306)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createDatabase$1.apply$mcV$sp(HiveClientImpl.scala:310)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createDatabase$1.apply(HiveClientImpl.scala:310)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createDatabase$1.apply(HiveClientImpl.scala:310)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:281)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:228)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:227)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:270)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.client.HiveClientImpl.createDatabase(HiveClientImpl.scala:309)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createDatabase$1.apply$mcV$sp(HiveExternalCatalog.scala:120)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createDatabase$1.apply(HiveExternalCatalog.scala:120)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$createDatabase$1.apply(HiveExternalCatalog.scala:120)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:87)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.HiveExternalCatalog.createDatabase(HiveExternalCatalog.scala:119)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createDatabase(SessionCatalog.scala:147)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.catalyst.catalog.SessionCatalog.<init>(SessionCatalog.scala:89)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.HiveSessionCatalog.<init>(HiveSessionCatalog.scala:49)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.HiveSessionState.catalog$lzycompute(HiveSessionState.scala:46)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.HiveSessionState.catalog(HiveSessionState.scala:45)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.HiveSessionState$$anon$1.<init>(HiveSessionState.scala:59)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.HiveSessionState.analyzer$lzycompute(HiveSessionState.scala:59)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.HiveSessionState.analyzer(HiveSessionState.scala:58)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:61)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.SparkSession.createDataFrame(SparkSession.scala:261)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.SQLContext.createDataFrame(SQLContext.scala:290)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.SparkSubmitClassLoaderTest$.main(HiveSparkSubmitSuite.scala:596)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.sql.hive.SparkSubmitClassLoaderTest.main(HiveSparkSubmitSuite.scala)
[info]   2016-08-31 09:35:51.968 - stderr>      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[info]   2016-08-31 09:35:51.968 - stderr>      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[info]   2016-08-31 09:35:51.968 - stderr>      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[info]   2016-08-31 09:35:51.968 - stderr>      at java.lang.reflect.Method.invoke(Method.java:498)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
[info]   2016-08-31 09:35:51.968 - stderr>      at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

From the log, the default database already exists in the metastore when the UT case is run a second time, so the re-run fails with AlreadyExistsException. The leftover metastore state from the previous run needs to be cleaned up (or the test needs to tolerate a pre-existing database).
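This failure pattern is typical of leftover embedded-Derby metastore state surviving between runs. As a minimal sketch (the paths below are the conventional defaults for an embedded Derby metastore and a Spark 2.x warehouse, not locations confirmed by this report), clearing the stale state before re-running the suite would look like:

```shell
# Hypothetical cleanup between test runs. These are the usual default
# locations for an embedded Derby metastore (metastore_db/, derby.log)
# and the Spark 2.x warehouse (spark-warehouse/); adjust if the suite
# configures different paths.
rm -rf metastore_db/ spark-warehouse/ derby.log
```

Alternatively, if the test itself is meant to be re-runnable against an existing metastore, an in-code fix along the lines of creating the database with ignoreIfExists = true (or catching AlreadyExistsException) would avoid the dependency on a fresh metastore.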



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
