Posted to issues@spark.apache.org by "Yeachan Park (Jira)" <ji...@apache.org> on 2022/07/11 10:07:00 UTC

[jira] [Created] (SPARK-39743) Unable to set zstd compression level while writing parquet

Yeachan Park created SPARK-39743:
------------------------------------

             Summary: Unable to set zstd compression level while writing parquet
                 Key: SPARK-39743
                 URL: https://issues.apache.org/jira/browse/SPARK-39743
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.2.0
            Reporter: Yeachan Park


While writing zstd-compressed parquet files, the setting `spark.io.compression.zstd.level` has no effect on the zstd compression level.

All files seem to be written with the default zstd compression level, and the config option seems to be ignored.

Using the zstd CLI tool, we confirmed that compressing the same file used in the Spark test at a higher compression level resulted in a smaller file, so the config option appears to be ignored rather than the data being insensitive to the level.
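For reference, a minimal reproduction sketch in Scala. The app name, output path, and row count are illustrative assumptions, not taken from the original report; only the config key `spark.io.compression.zstd.level` and the zstd parquet codec come from the description above.

{code:scala}
import org.apache.spark.sql.SparkSession

object ZstdLevelRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("zstd-level-repro") // illustrative name
      // Expected to raise the zstd compression level, but the resulting
      // parquet files come out the same size as with the default level.
      .config("spark.io.compression.zstd.level", "19")
      .getOrCreate()

    val df = spark.range(0L, 10000000L).toDF("id") // illustrative data

    // Write parquet with the zstd codec; compare the output size against
    // a run without the level config to observe that it is unchanged.
    df.write
      .option("compression", "zstd")
      .parquet("/tmp/zstd-level-repro") // illustrative path

    spark.stop()
  }
}
{code}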



