Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/08/05 06:42:35 UTC

[GitHub] [spark] sadikovi commented on a diff in pull request #37419: [SPARK-39833][SQL] Fix Parquet incorrect count issue when requiredSchema is empty and column index is enabled

sadikovi commented on code in PR #37419:
URL: https://github.com/apache/spark/pull/37419#discussion_r938492838


##########
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala:
##########
@@ -228,6 +228,13 @@ class ParquetFileFormat
       SQLConf.PARQUET_TIMESTAMP_NTZ_ENABLED.key,
       sparkSession.sessionState.conf.parquetTimestampNTZEnabled)
 
+    // See PARQUET-2170.
+    // Disable column index optimisation when required schema is empty so we get the correct
+    // row count from parquet-mr.
+    if (requiredSchema.isEmpty) {

Review Comment:
   No, this is not required for DSv2.
   
   The test passes in DSv2 because of another inconsistency: Parquet DSv2 does not consider the full file schema when creating pushdown filters. There is a check in FileScanBuilder that ignores partition columns, so in this case the required schema ends up empty, no filters are pushed down, and the correct number of records is returned. This is really a performance inefficiency in DSv2, since the entire file is scanned, but the result is correct.
   
   I thought about fixing it the same way DSv2 did, but that is a much bigger change as it would affect other cases besides this one. I hope my explanation makes sense. Let me know your thoughts.
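   
   For context, here is a minimal sketch of how the guard in the diff above might continue. The diff is truncated at the review anchor, so the body of the `if` is an assumption on my part; it uses parquet-mr's ParquetInputFormat.COLUMN_INDEX_FILTERING_ENABLED key, which controls column index filtering:
   
       import org.apache.parquet.hadoop.ParquetInputFormat
   
       // See PARQUET-2170.
       // Disable column index optimisation when required schema is empty so we get the
       // correct row count from parquet-mr.
       if (requiredSchema.isEmpty) {
         // Assumed continuation of the truncated diff: turn off column index
         // filtering on the Hadoop configuration passed to the Parquet readers.
         hadoopConf.setBoolean(ParquetInputFormat.COLUMN_INDEX_FILTERING_ENABLED, false)
       }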

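   To illustrate the FileScanBuilder behaviour described above, here is a toy sketch (a hypothetical helper, not the actual FileScanBuilder code): filters that reference a partition column are dropped before pushdown, so with an empty required data schema nothing is pushed and parquet-mr returns the full row count.
   
       import org.apache.spark.sql.sources.Filter
   
       // Keep only filters whose referenced columns are all data columns,
       // mirroring the "ignore partition columns" check described above.
       def pushableDataFilters(filters: Seq[Filter], partitionCols: Set[String]): Seq[Filter] =
         filters.filter(f => f.references.forall(c => !partitionCols.contains(c)))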


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

