Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/10/23 20:35:18 UTC

[GitHub] [spark] viirya commented on a change in pull request #32583: [SPARK-35437][SQL] Use expressions to filter Hive partitions at client side

viirya commented on a change in pull request #32583:
URL: https://github.com/apache/spark/pull/32583#discussion_r735009676



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -1008,6 +1008,16 @@ object SQLConf {
       .booleanConf
       .createWithDefault(false)
 
+  val HIVE_METASTORE_PARTITION_PRUNING_FAST_FALLBACK =
+    buildConf("spark.sql.hive.metastorePartitionPruningFastFallback")
+      .doc("When true and " +
+        s"we cannot do filtering on the server(${HIVE_METASTORE_PARTITION_PRUNING.key}), " +

Review comment:
       Can we rephrase this doc? "When this config is enabled, if the predicates are not supported by Hive or Spark falls back due to encountering a MetaException from the metastore, Spark will ...."
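For context, a completed form of the proposed SQLConf entry might look like the sketch below. This folds the reviewer's suggested wording into the `.doc(...)` string; the exact final phrasing, the continuation of the doc text, and the default value are assumptions here, not the merged code.

```scala
// Sketch only: the proposed config entry with the reviewer's suggested
// doc wording folded in. Final phrasing and default are assumptions.
val HIVE_METASTORE_PARTITION_PRUNING_FAST_FALLBACK =
  buildConf("spark.sql.hive.metastorePartitionPruningFastFallback")
    .doc("When this config is enabled, if the predicates are not supported " +
      "by Hive or Spark falls back due to encountering a MetaException " +
      "from the metastore, Spark will instead fetch all partition names " +
      "and evaluate the filter expressions on the client side.")
    .booleanConf
    .createWithDefault(false)
```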

##########
File path: sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveShim.scala
##########
@@ -900,13 +904,66 @@ private[client] class Shim_v0_13 extends Shim_v0_12 {
               s"${SQLConf.HIVE_METASTORE_PARTITION_PRUNING_FALLBACK_ON_EXCEPTION.key} " +
               " to false and let the query fail instead.", ex)

Review comment:
       I'd recommend updating the warning message here too. We can mention that users can enable this new config to alleviate the performance downgrade.
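To illustrate what the updated warning could point users toward, here is a hedged usage sketch of enabling the new config at runtime. The config key comes from the diff above; `spark` is assumed to be an existing `SparkSession`, and both forms below use standard Spark runtime-configuration APIs.

```scala
// Sketch: enabling the proposed client-side pruning fallback to mitigate
// the performance downgrade the warning describes. Assumes a SparkSession
// bound to `spark`.
spark.conf.set("spark.sql.hive.metastorePartitionPruningFastFallback", "true")

// Equivalent SQL form:
spark.sql("SET spark.sql.hive.metastorePartitionPruningFastFallback=true")
```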




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
For additional commands, e-mail: reviews-help@spark.apache.org