Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2022/08/17 17:23:00 UTC
[jira] [Assigned] (SPARK-40128) Add DELTA_LENGTH_BYTE_ARRAY as a recognized standalone encoding in VectorizedColumnReader
[ https://issues.apache.org/jira/browse/SPARK-40128?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-40128:
------------------------------------
Assignee: (was: Apache Spark)
> Add DELTA_LENGTH_BYTE_ARRAY as a recognized standalone encoding in VectorizedColumnReader
> -----------------------------------------------------------------------------------------
>
> Key: SPARK-40128
> URL: https://issues.apache.org/jira/browse/SPARK-40128
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 3.3.0
> Reporter: Dennis Huo
> Priority: Major
> Attachments: delta_length_byte_array.parquet
>
>
> Although https://issues.apache.org/jira/browse/SPARK-36879 added implementations for the DELTA_BINARY_PACKED, DELTA_BYTE_ARRAY, and DELTA_LENGTH_BYTE_ARRAY encodings, only DELTA_BINARY_PACKED and DELTA_BYTE_ARRAY were registered as top-level standalone column encodings; DELTA_LENGTH_BYTE_ARRAY is used only as a subcomponent of DELTA_BYTE_ARRAY (for decoding the non-shared string/binary suffixes).
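>
> For illustration, a minimal sketch of the kind of dispatch this is about, mirroring the switch-on-encoding pattern in VectorizedColumnReader; the enum and reader classes below are self-contained stubs rather than Spark's actual types:
>
>     // Sketch only: the Encoding values mirror Parquet's, the readers are stubs.
>     public class EncodingDispatchSketch {
>       enum Encoding { PLAIN, DELTA_BINARY_PACKED, DELTA_BYTE_ARRAY, DELTA_LENGTH_BYTE_ARRAY }
>
>       interface ValuesReader {}
>       static class PlainReader implements ValuesReader {}
>       static class DeltaBinaryPackedReader implements ValuesReader {}
>       static class DeltaByteArrayReader implements ValuesReader {}
>       static class DeltaLengthByteArrayReader implements ValuesReader {}
>
>       static ValuesReader getValuesReader(Encoding encoding) {
>         switch (encoding) {
>           case PLAIN: return new PlainReader();
>           case DELTA_BINARY_PACKED: return new DeltaBinaryPackedReader();
>           case DELTA_BYTE_ARRAY: return new DeltaByteArrayReader();
>           // The case this issue asks for; without it, the encoding falls
>           // through to the unsupported-encoding error below.
>           case DELTA_LENGTH_BYTE_ARRAY: return new DeltaLengthByteArrayReader();
>           default:
>             throw new UnsupportedOperationException("Unsupported encoding: " + encoding);
>         }
>       }
>
>       public static void main(String[] args) {
>         System.out.println(getValuesReader(Encoding.DELTA_LENGTH_BYTE_ARRAY).getClass().getSimpleName());
>       }
>     }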
> While there apparently aren't many writers that emit standalone DELTA_LENGTH_BYTE_ARRAY pages, it is part of the core Parquet spec: https://parquet.apache.org/docs/file-format/data-pages/encodings/#delta-length-byte-array-delta_length_byte_array–6 and can be more efficient than DELTA_BYTE_ARRAY for binary/string data whose values don't share common prefixes, where incremental prefix encoding buys little.
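>
> In this encoding, all value lengths come first (themselves DELTA_BINARY_PACKED), followed by the concatenated value bytes. A toy sketch of slicing the values back out, assuming the lengths have already been decoded (a hypothetical helper, not Spark's actual reader):
>
>     import java.nio.charset.StandardCharsets;
>     import java.util.ArrayList;
>     import java.util.List;
>
>     public class DeltaLengthByteArrayLayout {
>       // Recover each value from the concatenated data section.
>       static List<byte[]> sliceValues(int[] lengths, byte[] concatenated) {
>         List<byte[]> values = new ArrayList<>();
>         int offset = 0;
>         for (int len : lengths) {
>           byte[] value = new byte[len];
>           System.arraycopy(concatenated, offset, value, 0, len);
>           values.add(value);
>           offset += len;
>         }
>         return values;
>       }
>
>       public static void main(String[] args) {
>         // Three values of lengths 5, 5, 1 stored back-to-back.
>         byte[] data = "Helloworld!".getBytes(StandardCharsets.UTF_8);
>         for (byte[] v : sliceValues(new int[] {5, 5, 1}, data)) {
>           System.out.println(new String(v, StandardCharsets.UTF_8)); // Hello / world / !
>         }
>       }
>     }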
> The problem can be reproduced by trying to load one of the [https://github.com/apache/parquet-testing] files (delta_length_byte_array.parquet).
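>
> A minimal reproduction sketch (the local path is an assumption; the file comes from the apache/parquet-testing repository):
>
>     import org.apache.spark.sql.Dataset;
>     import org.apache.spark.sql.Row;
>     import org.apache.spark.sql.SparkSession;
>
>     public class DeltaLengthByteArrayRepro {
>       public static void main(String[] args) {
>         SparkSession spark = SparkSession.builder()
>             .master("local[*]")
>             .appName("SPARK-40128-repro")
>             .getOrCreate();
>         // Assumes delta_length_byte_array.parquet has been downloaded to /tmp.
>         Dataset<Row> df = spark.read().parquet("/tmp/delta_length_byte_array.parquet");
>         df.show(5, false);  // fails in the vectorized reader before this change
>         spark.stop();
>       }
>     }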
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org