Posted to github@arrow.apache.org by GitBox <gi...@apache.org> on 2020/10/27 15:05:04 UTC

[GitHub] [arrow] bkietz commented on a change in pull request #8507: ARROW-10131: [C++][Dataset][Python] Lazily parse parquet metadata

bkietz commented on a change in pull request #8507:
URL: https://github.com/apache/arrow/pull/8507#discussion_r512771071



##########
File path: cpp/src/arrow/dataset/file_parquet.cc
##########
@@ -162,8 +148,7 @@ static std::shared_ptr<StructScalar> ColumnChunkStatisticsAsStructScalar(
 
   // Optimize for corner case where all values are nulls
   if (statistics->num_values() == statistics->null_count()) {
-    auto null = MakeNullScalar(field->type());
-    return MakeMinMaxScalar(null, null);
+    return equal(std::move(field_expr), scalar(MakeNullScalar(field->type())));

Review comment:
       Yes, I suppose it'd be more correct to produce `not is_valid(field_expr)` here. Since `filter=ds.field("col") == pa.NULL` won't select nulls, I'd like to defer the more correct stats expression to a follow-up.
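
       For context, here is a minimal sketch (not part of this PR) of the null-propagation behavior being described, using pyarrow's compute kernels; the printed outputs in the comments are illustrative assumptions:

```python
import pyarrow as pa
import pyarrow.compute as pc

arr = pa.array([1, None, 3])

# Comparing against a null scalar yields null (unknown), not true,
# for every element, so a filter built from this expression selects nothing.
eq_null = pc.equal(arr, pa.scalar(None, type=pa.int64()))
print(eq_null)  # [null, null, null]

# is_null / is_valid produce definite booleans, so they do select null rows,
# which is why `not is_valid(field_expr)` is the more correct expression here.
print(pc.is_null(arr))   # [false, true, false]
print(pc.is_valid(arr))  # [true, false, true]
```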




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org