Posted to issues@spark.apache.org by "Fang-Xie (Jira)" <ji...@apache.org> on 2022/06/15 14:50:00 UTC
[jira] [Commented] (SPARK-39480) Parquet bit-packing de/encode optimization
[ https://issues.apache.org/jira/browse/SPARK-39480?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17554639#comment-17554639 ]
Fang-Xie commented on SPARK-39480:
----------------------------------
!image-2022-06-15-22-48-46-554.png!
> Parquet bit-packing de/encode optimization
> ------------------------------------------
>
> Key: SPARK-39480
> URL: https://issues.apache.org/jira/browse/SPARK-39480
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 3.3.0
> Reporter: Fang-Xie
> Priority: Major
> Fix For: 3.3.0
>
> Attachments: image-2022-06-15-22-48-46-554.png
>
>
> Spark currently uses parquet-mr as its Parquet reader/writer library, but the built-in bit-packing encode/decode is not efficient enough.
> Our optimization of Parquet bit-packing encode/decode with jdk.incubator.vector in OpenJDK 18 brings a prominent performance improvement.
> Because the Vector API has been incubating in OpenJDK since JDK 16, this optimization requires JDK 16 or higher.
> *Below are our test results*
> The functional test is based on the open-source parquet-mr bit-pack decoding function *_public final void unpack8Values(final byte[] in, final int inPos, final int[] out, final int outPos)_*
> compared with our Vector API implementation *_public final void unpack8Values_vec(final byte[] in, final int inPos, final int[] out, final int outPos)_*
> We tested 10 pairs of decode functions (open-source parquet-mr bit unpacking vs. our optimized vectorized SIMD implementation) with bit widths {1,2,3,4,5,6,7,8,9,10}; the test results are below:
>
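For context on what is being optimized: parquet-mr generates one specialized `unpack8Values` per bit width, each extracting 8 little-endian bit-packed values per call with scalar shifts and masks. The sketch below is a generic scalar reference of that extraction logic, assuming little-endian bit order; it is not parquet-mr's generated code and not the proposed Vector API version, only an illustration of the per-value work a `jdk.incubator.vector` implementation would replace with SIMD lanes. The `pack` helper is hypothetical, added only to make the demo self-contained.

```java
import java.util.Arrays;

public class UnpackSketch {
    // Generic scalar reference (illustrative, not parquet-mr's generated code):
    // unpack 8 little-endian bit-packed values of `bitWidth` bits each from
    // `in` starting at `inPos`, writing them to out[outPos..outPos+7].
    public static void unpack8Values(byte[] in, int inPos, int[] out, int outPos, int bitWidth) {
        long mask = (1L << bitWidth) - 1;
        long bitOffset = 0;
        for (int i = 0; i < 8; i++) {
            int byteIndex = inPos + (int) (bitOffset >>> 3);
            int shift = (int) (bitOffset & 7);
            // Gather up to 8 bytes little-endian into a long; enough headroom
            // for any bitWidth <= 32 plus the intra-byte shift.
            long buf = 0;
            for (int b = 0; b < 8 && byteIndex + b < in.length; b++) {
                buf |= (in[byteIndex + b] & 0xFFL) << (8 * b);
            }
            out[outPos + i] = (int) ((buf >>> shift) & mask);
            bitOffset += bitWidth;
        }
    }

    // Hypothetical helper for the demo: little-endian bit packer.
    static byte[] pack(int[] vals, int bitWidth) {
        byte[] out = new byte[(vals.length * bitWidth + 7) / 8];
        long bit = 0;
        for (int v : vals) {
            for (int b = 0; b < bitWidth; b++, bit++) {
                if (((v >>> b) & 1) != 0) {
                    out[(int) (bit >>> 3)] |= 1 << (bit & 7);
                }
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // bitWidth=3: eight 3-bit values fit in exactly 3 bytes.
        byte[] packed = pack(new int[]{1, 2, 3, 4, 5, 6, 7, 0}, 3);
        int[] out = new int[8];
        unpack8Values(packed, 0, out, 0, 3);
        System.out.println(Arrays.toString(out));
    }
}
```

The inner byte-gather and shift/mask is exactly the kind of independent per-value work that maps onto Vector API lanes, which is why the speedup grows with bit width and batch size.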
--
This message was sent by Atlassian Jira
(v8.20.7#820007)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org