Posted to reviews@spark.apache.org by wangyum <gi...@git.apache.org> on 2018/02/24 16:25:35 UTC

[GitHub] spark pull request #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 ...

GitHub user wangyum opened a pull request:

    https://github.com/apache/spark/pull/20668

    [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metastore

    ## What changes were proposed in this pull request?
    
    Support Hive 2.2 and Hive 2.3 metastore.
    
    ## How was this patch tested?
    
    Existing tests.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/wangyum/spark SPARK-23510

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/20668.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #20668
    
----
commit 5b1fc0145efbdd427e8b49bd0f840f709d4bc801
Author: Yuming Wang <yu...@...>
Date:   2018-02-24T16:19:35Z

    Support Hive 2.2 and Hive 2.3

----


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/1031/
    Test PASSed.


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    Merged build finished. Test PASSed.


---



[GitHub] spark pull request #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 ...

Posted by wangyum <gi...@git.apache.org>.
Github user wangyum commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20668#discussion_r170425667
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveUtils.scala ---
    @@ -202,7 +202,6 @@ private[spark] object HiveUtils extends Logging {
           ConfVars.METASTORE_AGGREGATE_STATS_CACHE_MAX_READER_WAIT -> TimeUnit.MILLISECONDS,
           ConfVars.HIVES_AUTO_PROGRESS_TIMEOUT -> TimeUnit.SECONDS,
           ConfVars.HIVE_LOG_INCREMENTAL_PLAN_PROGRESS_INTERVAL -> TimeUnit.MILLISECONDS,
    -      ConfVars.HIVE_STATS_JDBC_TIMEOUT -> TimeUnit.SECONDS,
    --- End diff --
    
    Removed `HIVE_STATS_JDBC_TIMEOUT`.
    For details, see: https://issues.apache.org/jira/browse/HIVE-12164


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    Merged build finished. Test PASSed.


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by wangyum <gi...@git.apache.org>.
Github user wangyum commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    Otherwise, `SessionCatalogSuite` also needs to be updated:
    
    ```scala
    Index: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalogSuite.scala
    ===================================================================
    --- sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalogSuite.scala	(date 1519557876000)
    +++ sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalogSuite.scala	(date 1519702924000)
    @@ -955,8 +955,10 @@
           val oldPart1 = catalog.getPartition(TableIdentifier("tbl2", Some("db2")), part1.spec)
           val oldPart2 = catalog.getPartition(TableIdentifier("tbl2", Some("db2")), part2.spec)
           catalog.alterPartitions(TableIdentifier("tbl2", Some("db2")), Seq(
    -        oldPart1.copy(storage = storageFormat.copy(locationUri = Some(newLocation))),
    -        oldPart2.copy(storage = storageFormat.copy(locationUri = Some(newLocation)))))
    +        oldPart1.copy(parameters = oldPart1.parameters,
    +          storage = storageFormat.copy(locationUri = Some(newLocation))),
    +        oldPart2.copy(parameters = oldPart2.parameters,
    +          storage = storageFormat.copy(locationUri = Some(newLocation)))))
           val newPart1 = catalog.getPartition(TableIdentifier("tbl2", Some("db2")), part1.spec)
           val newPart2 = catalog.getPartition(TableIdentifier("tbl2", Some("db2")), part2.spec)
           assert(newPart1.storage.locationUri == Some(newLocation))
    @@ -965,7 +967,9 @@
           assert(oldPart2.storage.locationUri != Some(newLocation))
           // Alter partitions without explicitly specifying database
           catalog.setCurrentDatabase("db2")
    -      catalog.alterPartitions(TableIdentifier("tbl2"), Seq(oldPart1, oldPart2))
    +      catalog.alterPartitions(TableIdentifier("tbl2"),
    +        Seq(oldPart1.copy(parameters = newPart1.parameters),
    +          oldPart2.copy(parameters = newPart2.parameters)))
           val newerPart1 = catalog.getPartition(TableIdentifier("tbl2"), part1.spec)
           val newerPart2 = catalog.getPartition(TableIdentifier("tbl2"), part2.spec)
           assert(oldPart1.storage.locationUri == newerPart1.storage.locationUri)
    
    ```


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    **[Test build #87645 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87645/testReport)** for PR 20668 at commit [`5b1fc01`](https://github.com/apache/spark/commit/5b1fc0145efbdd427e8b49bd0f840f709d4bc801).


---



[GitHub] spark pull request #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 ...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20668#discussion_r170440954
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala ---
    @@ -1146,3 +1146,25 @@ private[client] class Shim_v2_1 extends Shim_v2_0 {
         alterPartitionsMethod.invoke(hive, tableName, newParts, environmentContextInAlterTable)
       }
     }
    +
    +private[client] class Shim_v2_2 extends Shim_v2_1 {
    +
    +}
    --- End diff --
    
    Please remove `{}`
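    
    i.e., in Scala the empty subclass can be declared without a body, along the lines of:
    
    ```scala
    private[client] class Shim_v2_2 extends Shim_v2_1
    ```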


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    Based on my understanding, the test failure is caused by a bug in the test case. When doing an alter partition, is it possible that we could alter a partition without `TOTAL_SIZE` and `NUM_FILES`?


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    Merged build finished. Test PASSed.


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by wangyum <gi...@git.apache.org>.
Github user wangyum commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    Yes, if we do not add `alterPartitionsMethod`, [HiveExternalSessionCatalogSuite.alter partitions](https://github.com/apache/spark/blob/d73bb92a72fdd6c1901c070a91b70b845a034e88/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalogSuite.scala#L951) will fail, too.


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    Merged build finished. Test PASSed.


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/87646/
    Test PASSed.


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    **[Test build #87646 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87646/testReport)** for PR 20668 at commit [`48343bc`](https://github.com/apache/spark/commit/48343bc8214468b58dcffcc8d968c870ee0189be).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


---



[GitHub] spark pull request #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 ...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20668#discussion_r170459915
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveUtils.scala ---
    @@ -202,7 +202,6 @@ private[spark] object HiveUtils extends Logging {
           ConfVars.METASTORE_AGGREGATE_STATS_CACHE_MAX_READER_WAIT -> TimeUnit.MILLISECONDS,
           ConfVars.HIVES_AUTO_PROGRESS_TIMEOUT -> TimeUnit.SECONDS,
           ConfVars.HIVE_LOG_INCREMENTAL_PLAN_PROGRESS_INTERVAL -> TimeUnit.MILLISECONDS,
    -      ConfVars.HIVE_STATS_JDBC_TIMEOUT -> TimeUnit.SECONDS,
    --- End diff --
    
    But we also support all the previous versions
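    
    One way to keep the conversion for older clients while dropping the hard reference (just a sketch, not what this PR does; `optionalTimeVar` and `legacyTimeVars` are made-up names) would be to resolve the `ConfVars` enum constant by name and register it only when it exists:
    
    ```scala
    import java.util.concurrent.TimeUnit
    import scala.util.Try
    
    import org.apache.hadoop.hive.conf.HiveConf.ConfVars
    
    // ConfVars is a Java enum, so valueOf throws IllegalArgumentException when a
    // constant such as HIVE_STATS_JDBC_TIMEOUT is absent (it was removed by
    // HIVE-12164); Try turns that into None instead of failing.
    def optionalTimeVar(name: String, unit: TimeUnit): Option[(ConfVars, TimeUnit)] =
      Try(ConfVars.valueOf(name)).toOption.map(_ -> unit)
    
    // Add the legacy-only entries conditionally, so the same mapping works
    // whether or not the constants are present on the classpath.
    val legacyTimeVars: Map[ConfVars, TimeUnit] = Seq(
      optionalTimeVar("HIVE_STATS_JDBC_TIMEOUT", TimeUnit.SECONDS),
      optionalTimeVar("HIVE_STATS_RETRIES_WAIT", TimeUnit.MILLISECONDS)
    ).flatten.toMap
    ```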


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/1039/
    Test PASSed.


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/87645/
    Test PASSed.


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    Also need to update `HiveClientVersions.scala`


---



[GitHub] spark pull request #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 ...

Posted by wangyum <gi...@git.apache.org>.
Github user wangyum commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20668#discussion_r170425408
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala ---
    @@ -1146,3 +1146,25 @@ private[client] class Shim_v2_1 extends Shim_v2_0 {
         alterPartitionsMethod.invoke(hive, tableName, newParts, environmentContextInAlterTable)
       }
     }
    +
    +private[client] class Shim_v2_2 extends Shim_v2_1 {
    +
    +}
    +
    +private[client] class Shim_v2_3 extends Shim_v2_2 {
    +
    +  val environmentContext = new EnvironmentContext()
    +  environmentContext.putToProperties("DO_NOT_UPDATE_STATS", "true")
    --- End diff --
    
    Otherwise it will throw a `NumberFormatException`:
    ```
    [info] Cause: java.lang.NumberFormatException: null
    [info] at java.lang.Long.parseLong(Long.java:552)
    [info] at java.lang.Long.parseLong(Long.java:631)
    [info] at org.apache.hadoop.hive.metastore.MetaStoreUtils.isFastStatsSame(MetaStoreUtils.java:315)
    [info] at org.apache.hadoop.hive.metastore.HiveAlterHandler.alterPartitions(HiveAlterHandler.java:605)
    [info] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.alter_partitions_with_environment_context(HiveMetaStore.java:3837)
    ```
    For details, see: https://issues.apache.org/jira/browse/HIVE-15653
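    
    For illustration only (not part of the PR): the exception above comes from `MetaStoreUtils.isFastStatsSame`, which parses the fast-stats parameters with `Long.parseLong`. A partition whose parameters map lacks a stats entry such as `totalSize` or `numFiles` returns null for the lookup, and `parseLong(null)` throws `NumberFormatException: null`, which is why the shim asks the metastore not to update stats:
    
    ```scala
    import scala.util.Try
    
    // A parameters map without the fast-stats keys, as in the failing test.
    val parameters = new java.util.HashMap[String, String]()
    
    // parameters.get("totalSize") is null, and Long.parseLong(null) throws
    // java.lang.NumberFormatException: null, the same error as in the trace.
    println(Try(java.lang.Long.parseLong(parameters.get("totalSize"))))
    // Failure(java.lang.NumberFormatException: null)
    ```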


---



[GitHub] spark pull request #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 ...

Posted by wangyum <gi...@git.apache.org>.
Github user wangyum commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20668#discussion_r170450895
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala ---
    @@ -1146,3 +1146,25 @@ private[client] class Shim_v2_1 extends Shim_v2_0 {
         alterPartitionsMethod.invoke(hive, tableName, newParts, environmentContextInAlterTable)
       }
     }
    +
    +private[client] class Shim_v2_2 extends Shim_v2_1 {
    +
    +}
    +
    +private[client] class Shim_v2_3 extends Shim_v2_2 {
    +
    +  val environmentContext = new EnvironmentContext()
    +  environmentContext.putToProperties("DO_NOT_UPDATE_STATS", "true")
    +
    +  private lazy val alterPartitionsMethod =
    +    findMethod(
    +      classOf[Hive],
    +      "alterPartitions",
    +      classOf[String],
    +      classOf[JList[Partition]],
    +      classOf[EnvironmentContext])
    +
    +  override def alterPartitions(hive: Hive, tableName: String, newParts: JList[Partition]): Unit = {
    --- End diff --
    
    The `2.3: alterPartitions` test in `VersionsSuite`:
    ```
    [info] - 2.3: alterPartitions *** FAILED *** (50 milliseconds)
    [info]   java.lang.reflect.InvocationTargetException:
    [info]   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [info]   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    [info]   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    [info]   at java.lang.reflect.Method.invoke(Method.java:498)
    [info]   at org.apache.spark.sql.hive.client.Shim_v2_1.alterPartitions(HiveShim.scala:1144)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$alterPartitions$1.apply$mcV$sp(HiveClientImpl.scala:616)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$alterPartitions$1.apply(HiveClientImpl.scala:607)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$alterPartitions$1.apply(HiveClientImpl.scala:607)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:275)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:213)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:212)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:258)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl.alterPartitions(HiveClientImpl.scala:607)
    [info]   at org.apache.spark.sql.hive.client.VersionsSuite$$anonfun$6$$anonfun$apply$55.apply(VersionsSuite.scala:432)
    [info]   at org.apache.spark.sql.hive.client.VersionsSuite$$anonfun$6$$anonfun$apply$55.apply(VersionsSuite.scala:424)
    [info]   at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
    [info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
    [info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
    [info]   at org.scalatest.Transformer.apply(Transformer.scala:20)
    [info]   at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
    [info]   at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:103)
    [info]   at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
    [info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
    [info]   at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
    [info]   at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
    [info]   at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
    [info]   at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
    [info]   at scala.collection.immutable.List.foreach(List.scala:381)
    [info]   at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
    [info]   at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
    [info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
    [info]   at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
    [info]   at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
    [info]   at org.scalatest.Suite$class.run(Suite.scala:1147)
    [info]   at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
    [info]   at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
    [info]   at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
    [info]   at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:52)
    [info]   at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
    [info]   at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
    [info]   at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:52)
    [info]   at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
    [info]   at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:480)
    [info]   at sbt.ForkMain$Run$2.call(ForkMain.java:296)
    [info]   at sbt.ForkMain$Run$2.call(ForkMain.java:286)
    [info]   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    [info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    [info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    [info]   at java.lang.Thread.run(Thread.java:748)
    [info]   Cause: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter partition. java.lang.NumberFormatException: null
    [info]   at org.apache.hadoop.hive.ql.metadata.Hive.alterPartitions(Hive.java:736)
    [info]   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [info]   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    [info]   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    [info]   at java.lang.reflect.Method.invoke(Method.java:498)
    [info]   at org.apache.spark.sql.hive.client.Shim_v2_1.alterPartitions(HiveShim.scala:1144)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$alterPartitions$1.apply$mcV$sp(HiveClientImpl.scala:616)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$alterPartitions$1.apply(HiveClientImpl.scala:607)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$alterPartitions$1.apply(HiveClientImpl.scala:607)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:275)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:213)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:212)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:258)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl.alterPartitions(HiveClientImpl.scala:607)
    [info]   at org.apache.spark.sql.hive.client.VersionsSuite$$anonfun$6$$anonfun$apply$55.apply(VersionsSuite.scala:432)
    [info]   at org.apache.spark.sql.hive.client.VersionsSuite$$anonfun$6$$anonfun$apply$55.apply(VersionsSuite.scala:424)
    [info]   at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
    [info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
    [info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
    [info]   at org.scalatest.Transformer.apply(Transformer.scala:20)
    [info]   at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
    [info]   at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:103)
    [info]   at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
    [info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
    [info]   at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
    [info]   at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
    [info]   at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
    [info]   at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
    [info]   at scala.collection.immutable.List.foreach(List.scala:381)
    [info]   at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
    [info]   at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
    [info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
    [info]   at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
    [info]   at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
    [info]   at org.scalatest.Suite$class.run(Suite.scala:1147)
    [info]   at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
    [info]   at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
    [info]   at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
    [info]   at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:52)
    [info]   at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
    [info]   at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
    [info]   at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:52)
    [info]   at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
    [info]   at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:480)
    [info]   at sbt.ForkMain$Run$2.call(ForkMain.java:296)
    [info]   at sbt.ForkMain$Run$2.call(ForkMain.java:286)
    [info]   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    [info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    [info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    [info]   at java.lang.Thread.run(Thread.java:748)
    [info]   Cause: org.apache.hadoop.hive.metastore.api.MetaException: java.lang.NumberFormatException: null
    [info]   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newMetaException(HiveMetaStore.java:6143)
    [info]   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.alter_partitions_with_environment_context(HiveMetaStore.java:3876)
    [info]   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [info]   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    [info]   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    [info]   at java.lang.reflect.Method.invoke(Method.java:498)
    [info]   at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
    [info]   at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
    [info]   at com.sun.proxy.$Proxy101.alter_partitions_with_environment_context(Unknown Source)
    [info]   at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.alter_partitions(HiveMetaStoreClient.java:1527)
    [info]   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [info]   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    [info]   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    [info]   at java.lang.reflect.Method.invoke(Method.java:498)
    [info]   at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:173)
    [info]   at com.sun.proxy.$Proxy102.alter_partitions(Unknown Source)
    [info]   at org.apache.hadoop.hive.ql.metadata.Hive.alterPartitions(Hive.java:734)
    [info]   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [info]   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    [info]   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    [info]   at java.lang.reflect.Method.invoke(Method.java:498)
    [info]   at org.apache.spark.sql.hive.client.Shim_v2_1.alterPartitions(HiveShim.scala:1144)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$alterPartitions$1.apply$mcV$sp(HiveClientImpl.scala:616)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$alterPartitions$1.apply(HiveClientImpl.scala:607)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$alterPartitions$1.apply(HiveClientImpl.scala:607)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:275)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:213)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:212)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:258)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl.alterPartitions(HiveClientImpl.scala:607)
    [info]   at org.apache.spark.sql.hive.client.VersionsSuite$$anonfun$6$$anonfun$apply$55.apply(VersionsSuite.scala:432)
    [info]   at org.apache.spark.sql.hive.client.VersionsSuite$$anonfun$6$$anonfun$apply$55.apply(VersionsSuite.scala:424)
    [info]   at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
    [info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
    [info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
    [info]   at org.scalatest.Transformer.apply(Transformer.scala:20)
    [info]   at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
    [info]   at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:103)
    [info]   at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
    [info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
    [info]   at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
    [info]   at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
    [info]   at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
    [info]   at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
    [info]   at scala.collection.immutable.List.foreach(List.scala:381)
    [info]   at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
    [info]   at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
    [info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
    [info]   at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
    [info]   at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
    [info]   at org.scalatest.Suite$class.run(Suite.scala:1147)
    [info]   at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
    [info]   at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
    [info]   at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
    [info]   at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:52)
    [info]   at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
    [info]   at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
    [info]   at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:52)
    [info]   at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
    [info]   at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:480)
    [info]   at sbt.ForkMain$Run$2.call(ForkMain.java:296)
    [info]   at sbt.ForkMain$Run$2.call(ForkMain.java:286)
    [info]   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    [info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    [info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    [info]   at java.lang.Thread.run(Thread.java:748)
    [info]   Cause: java.lang.NumberFormatException: null
    [info]   at java.lang.Long.parseLong(Long.java:552)
    [info]   at java.lang.Long.parseLong(Long.java:631)
    [info]   at org.apache.hadoop.hive.metastore.MetaStoreUtils.isFastStatsSame(MetaStoreUtils.java:315)
    [info]   at org.apache.hadoop.hive.metastore.HiveAlterHandler.alterPartitions(HiveAlterHandler.java:605)
    [info]   at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.alter_partitions_with_environment_context(HiveMetaStore.java:3837)
    [info]   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [info]   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    [info]   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    [info]   at java.lang.reflect.Method.invoke(Method.java:498)
    [info]   at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
    [info]   at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
    [info]   at com.sun.proxy.$Proxy101.alter_partitions_with_environment_context(Unknown Source)
    [info]   at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.alter_partitions(HiveMetaStoreClient.java:1527)
    [info]   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [info]   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    [info]   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    [info]   at java.lang.reflect.Method.invoke(Method.java:498)
    [info]   at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:173)
    [info]   at com.sun.proxy.$Proxy102.alter_partitions(Unknown Source)
    [info]   at org.apache.hadoop.hive.ql.metadata.Hive.alterPartitions(Hive.java:734)
    [info]   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [info]   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    [info]   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    [info]   at java.lang.reflect.Method.invoke(Method.java:498)
    [info]   at org.apache.spark.sql.hive.client.Shim_v2_1.alterPartitions(HiveShim.scala:1144)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$alterPartitions$1.apply$mcV$sp(HiveClientImpl.scala:616)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$alterPartitions$1.apply(HiveClientImpl.scala:607)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$alterPartitions$1.apply(HiveClientImpl.scala:607)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:275)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:213)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:212)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:258)
    [info]   at org.apache.spark.sql.hive.client.HiveClientImpl.alterPartitions(HiveClientImpl.scala:607)
    [info]   at org.apache.spark.sql.hive.client.VersionsSuite$$anonfun$6$$anonfun$apply$55.apply(VersionsSuite.scala:432)
    [info]   at org.apache.spark.sql.hive.client.VersionsSuite$$anonfun$6$$anonfun$apply$55.apply(VersionsSuite.scala:424)
    [info]   at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
    [info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
    [info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
    [info]   at org.scalatest.Transformer.apply(Transformer.scala:20)
    [info]   at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
    [info]   at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:103)
    [info]   at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
    [info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
    [info]   at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
    [info]   at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
    [info]   at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
    [info]   at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
    [info]   at scala.collection.immutable.List.foreach(List.scala:381)
    [info]   at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
    [info]   at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
    [info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
    [info]   at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
    [info]   at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
    [info]   at org.scalatest.Suite$class.run(Suite.scala:1147)
    [info]   at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
    [info]   at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
    [info]   at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
    [info]   at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
    [info]   at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:52)
    [info]   at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
    [info]   at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
    [info]   at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:52)
    [info]   at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
    [info]   at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:480)
    [info]   at sbt.ForkMain$Run$2.call(ForkMain.java:296)
    [info]   at sbt.ForkMain$Run$2.call(ForkMain.java:286)
    [info]   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    [info]   at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    [info]   at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    [info]   at java.lang.Thread.run(Thread.java:748)
    ```
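    
    For context, the quoted diff above is cut off at the override's signature. Inside `Shim_v2_3` (reusing the imports already in `HiveShim.scala`), the body presumably just mirrors the reflective pattern of `Shim_v2_1`, passing the `DO_NOT_UPDATE_STATS` environment context instead (a sketch of the likely shape, not the exact PR code):
    
    ```scala
    override def alterPartitions(hive: Hive, tableName: String, newParts: JList[Partition]): Unit = {
      // Same reflective call as Shim_v2_1, but with the EnvironmentContext that
      // carries DO_NOT_UPDATE_STATS, so the metastore skips the stats check above.
      alterPartitionsMethod.invoke(hive, tableName, newParts, environmentContext)
    }
    ```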


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    **[Test build #87652 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87652/testReport)** for PR 20668 at commit [`4db3dc9`](https://github.com/apache/spark/commit/4db3dc9a1a4f6f83289f21a61d0ef51da3a3e232).


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    Merged build finished. Test PASSed.


---



[GitHub] spark pull request #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 ...

Posted by wangyum <gi...@git.apache.org>.
Github user wangyum commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20668#discussion_r170425631
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveUtils.scala ---
    @@ -202,8 +202,6 @@ private[spark] object HiveUtils extends Logging {
           ConfVars.METASTORE_AGGREGATE_STATS_CACHE_MAX_READER_WAIT -> TimeUnit.MILLISECONDS,
           ConfVars.HIVES_AUTO_PROGRESS_TIMEOUT -> TimeUnit.SECONDS,
           ConfVars.HIVE_LOG_INCREMENTAL_PLAN_PROGRESS_INTERVAL -> TimeUnit.MILLISECONDS,
    -      ConfVars.HIVE_STATS_JDBC_TIMEOUT -> TimeUnit.SECONDS,
    -      ConfVars.HIVE_STATS_RETRIES_WAIT -> TimeUnit.MILLISECONDS,
    --- End diff --
    
    Removed `HIVE_STATS_JDBC_TIMEOUT`.
    For details, see: https://issues.apache.org/jira/browse/HIVE-12164


---



[GitHub] spark pull request #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 ...

Posted by wangyum <gi...@git.apache.org>.
Github user wangyum closed the pull request at:

    https://github.com/apache/spark/pull/20668


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    Merged build finished. Test PASSed.


---



[GitHub] spark pull request #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 ...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20668#discussion_r170441161
  
    --- Diff: sql/hive/src/test/scala/org/apache/spark/sql/hive/client/VersionsSuite.scala ---
    @@ -125,7 +126,7 @@ class VersionsSuite extends SparkFunSuite with Logging {
           // Hive changed the default of datanucleus.schema.autoCreateAll from true to false and
           // hive.metastore.schema.verification from false to true since 2.0
           // For details, see the JIRA HIVE-6113 and HIVE-12463
    -      if (version == "2.0" || version == "2.1") {
    +      if (version.split("\\.").head.toInt > 1) {
    --- End diff --
    
    ```Scala
    if (version == "2.0" || version == "2.1" || version == "2.2" || version == "2.3") {
    ```


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    **[Test build #87646 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87646/testReport)** for PR 20668 at commit [`48343bc`](https://github.com/apache/spark/commit/48343bc8214468b58dcffcc8d968c870ee0189be).


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/testing-k8s-prb-make-spark-distribution/1030/
    Test PASSed.


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    Did we hit the test failure?


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/87652/
    Test PASSed.


---



[GitHub] spark pull request #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 ...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20668#discussion_r170444340
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala ---
    @@ -1146,3 +1146,25 @@ private[client] class Shim_v2_1 extends Shim_v2_0 {
         alterPartitionsMethod.invoke(hive, tableName, newParts, environmentContextInAlterTable)
       }
     }
    +
    +private[client] class Shim_v2_2 extends Shim_v2_1 {
    +
    +}
    +
    +private[client] class Shim_v2_3 extends Shim_v2_2 {
    +
    +  val environmentContext = new EnvironmentContext()
    +  environmentContext.putToProperties("DO_NOT_UPDATE_STATS", "true")
    +
    +  private lazy val alterPartitionsMethod =
    +    findMethod(
    +      classOf[Hive],
    +      "alterPartitions",
    +      classOf[String],
    +      classOf[JList[Partition]],
    +      classOf[EnvironmentContext])
    +
    +  override def alterPartitions(hive: Hive, tableName: String, newParts: JList[Partition]): Unit = {
    --- End diff --
    
    If we do not add `alterPartitionsMethod `, which test case will fail? 


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    Why does my PR https://github.com/apache/spark/pull/20671 not fail?


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    **[Test build #87645 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87645/testReport)** for PR 20668 at commit [`5b1fc01`](https://github.com/apache/spark/commit/5b1fc0145efbdd427e8b49bd0f840f709d4bc801).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


---



[GitHub] spark issue #20668: [SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metasto...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/20668
  
    **[Test build #87652 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/87652/testReport)** for PR 20668 at commit [`4db3dc9`](https://github.com/apache/spark/commit/4db3dc9a1a4f6f83289f21a61d0ef51da3a3e232).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.


---
