Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2022/10/09 10:50:00 UTC
[jira] [Updated] (SPARK-40719) CTAS should respect TBLPROPERTIES during execution
[ https://issues.apache.org/jira/browse/SPARK-40719?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-40719:
----------------------------------
Description:
{code}
spark-sql> CREATE TABLE t1 STORED AS PARQUET TBLPROPERTIES (parquet.compression 'zstd') AS SELECT 1;
$ ls spark-warehouse/t1
_SUCCESS
part-00000-c28a99f0-a88f-448d-a60d-e8204bf2f3a7-c000.zstd.parquet
{code}
{code}
spark-sql> CREATE TABLE t2 USING PARQUET TBLPROPERTIES (parquet.compression 'zstd') AS SELECT 1;
$ ls spark-warehouse/t2
_SUCCESS
part-00000-3c5853d0-308d-4571-86c3-3e1a31eacfe6-c000.snappy.parquet
{code}
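The Hive path ({{STORED AS PARQUET}}) applies the {{parquet.compression}} table property, while the datasource path ({{USING PARQUET}}) ignores it and falls back to the default snappy codec. As a possible workaround (an assumption for illustration, not verified in this ticket), the codec can be passed through the datasource {{OPTIONS}} clause or the session conf, both of which the datasource writer consults:
{code}
-- Workaround sketch (assumed, not part of this ticket): set the codec
-- via OPTIONS or the session conf instead of TBLPROPERTIES.
spark-sql> CREATE TABLE t3 USING PARQUET OPTIONS ('compression' 'zstd') AS SELECT 1;
spark-sql> SET spark.sql.parquet.compression.codec=zstd;
{code}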
was:
{code:java}
spark-sql> CREATE TABLE t1 STORED AS PARQUET TBLPROPERTIES (parquet.compression 'zstd') AS SELECT 1;
$ ls spark-warehouse/t1
_SUCCESS
part-00000-c28a99f0-a88f-448d-a60d-e8204bf2f3a7-c000.zstd.parquet
{code}
> CTAS should respect TBLPROPERTIES during execution
> --------------------------------------------------
>
> Key: SPARK-40719
> URL: https://issues.apache.org/jira/browse/SPARK-40719
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.3.0
> Reporter: Dongjoon Hyun
> Priority: Major
>
> {code}
> spark-sql> CREATE TABLE t1 STORED AS PARQUET TBLPROPERTIES (parquet.compression 'zstd') AS SELECT 1;
> $ ls spark-warehouse/t1
> _SUCCESS
> part-00000-c28a99f0-a88f-448d-a60d-e8204bf2f3a7-c000.zstd.parquet
> {code}
> {code}
> spark-sql> CREATE TABLE t2 USING PARQUET TBLPROPERTIES (parquet.compression 'zstd') AS SELECT 1;
> $ ls spark-warehouse/t2
> _SUCCESS
> part-00000-3c5853d0-308d-4571-86c3-3e1a31eacfe6-c000.snappy.parquet
> {code}
--
This message was sent by Atlassian Jira
(v8.20.10#820010)