Posted to github@arrow.apache.org by GitBox <gi...@apache.org> on 2021/07/16 05:37:08 UTC

[GitHub] [arrow] westonpace commented on a change in pull request #10729: ARROW-12513: [C++][Parquet] Parquet Writer always puts null_count=0 in Parquet statistics for dictionary-encoded array with nulls

westonpace commented on a change in pull request #10729:
URL: https://github.com/apache/arrow/pull/10729#discussion_r670978233



##########
File path: cpp/src/parquet/statistics.cc
##########
@@ -567,6 +568,27 @@ class TypedStatisticsImpl : public TypedStatistics<DType> {
     SetMinMaxPair(comparator_->GetMinMax(values));
   }
 
+  void UpdateArrowDictionary(const ::arrow::Array& indices,
+                             const ::arrow::Array& dictionary) {
+    IncrementNullCount(indices.null_count());
+    IncrementNumValues(indices.length() - indices.null_count());
+
+    if (indices.null_count() == indices.length()) {
+      return;
+    }
+
+    ::arrow::compute::ExecContext ctx(pool_);
+    PARQUET_ASSIGN_OR_THROW(auto referenced_indices,
+                            ::arrow::compute::Unique(indices, &ctx));
+    PARQUET_ASSIGN_OR_THROW(

Review comment:
      I'll create a follow-up.  Another approach I considered was to compute the unique indices and then pass those down as a selection filter to the GetMinMax function.  My goal was to minimize changes to the existing code.
   
   Or, since we have all these compute kernels now, just use ::arrow::compute::MinMax and pass the min & max into the statistics directly.
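    To make the suggestion concrete, here is a small sketch of the "use the compute kernels directly" idea, written against pyarrow (which exposes the same Unique/Take/MinMax kernels as the C++ ::arrow::compute namespace in the diff above). The helper name `dictionary_min_max` and the sample data are illustrative, not part of the PR: take the unique indices, gather only the referenced dictionary values, and let MinMax produce the min/max to feed into the statistics.

    ```python
    import pyarrow as pa
    import pyarrow.compute as pc

    def dictionary_min_max(dict_array: pa.DictionaryArray):
        # Drop duplicate indices so only dictionary entries actually
        # referenced by the data contribute to the statistics.
        unique_indices = pc.unique(dict_array.indices)
        # Gather the referenced values; a null index yields a null value,
        # which min_max skips by default (skip_nulls=True).
        referenced = dict_array.dictionary.take(unique_indices)
        mm = pc.min_max(referenced)  # struct scalar with "min"/"max" fields
        return mm["min"].as_py(), mm["max"].as_py()

    arr = pa.DictionaryArray.from_arrays(
        pa.array([0, 1, None, 0], type=pa.int32()),  # indices, with a null
        pa.array([5, 2, 99]))                        # 99 is never referenced
    print(dictionary_min_max(arr))  # (2, 5): nulls and the unreferenced 99 are excluded
    ```

    Note that the Unique step matters for correctness, not just speed: it keeps unreferenced dictionary entries (like 99 above) out of the min/max, which is exactly the pitfall of running MinMax over the raw dictionary.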




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscribe@arrow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org