Posted to issues@spark.apache.org by "XiDuo You (Jira)" <ji...@apache.org> on 2021/11/12 01:16:00 UTC
[jira] [Updated] (SPARK-37287) Pull out dynamic partition and bucket sort from FileFormatWriter
[ https://issues.apache.org/jira/browse/SPARK-37287?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
XiDuo You updated SPARK-37287:
------------------------------
Description:
FileFormatWriter.write is currently used by every V1 write path, covering both datasource and Hive tables. However, it internally adds a sort based on the dynamic partition and bucket columns, and this sort is not visible in the query plan.
V2 writes take a better approach: the rule `V2Writes` adds the required ordering (or even distribution) to the plan.
V1 writes should do the same as V2 writes.
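The idea above can be sketched with a toy planner rule. This is a minimal, self-contained illustration, not Spark's actual implementation: all names here (TinyPlan, V1Write, pullOutSort, etc.) are hypothetical stand-ins for Catalyst's plan nodes and the rule that `V2Writes` plays for V2 writes. The point it demonstrates is making the writer's required ordering an explicit Sort node in the plan instead of a hidden sort inside FileFormatWriter.

```scala
// Hypothetical, simplified model of "pull out the sort from the writer".
// None of these types are real Spark APIs; they only illustrate the shape
// of the rule.

sealed trait TinyPlan
case class Relation(output: Seq[String]) extends TinyPlan
case class Sort(sortColumns: Seq[String], child: TinyPlan) extends TinyPlan
case class V1Write(
    dynamicPartitionColumns: Seq[String],
    bucketColumns: Seq[String],
    child: TinyPlan) extends TinyPlan

// Required ordering: dynamic partition columns first, then bucket columns
// (mirroring the ordering FileFormatWriter enforces internally today).
def requiredOrdering(w: V1Write): Seq[String] =
  w.dynamicPartitionColumns ++ w.bucketColumns

// The rule: if the write's child does not already provide the required
// ordering, insert an explicit Sort under the write, so the sort becomes
// visible in the plan (as V2Writes does for V2 writes).
def pullOutSort(plan: TinyPlan): TinyPlan = plan match {
  case w @ V1Write(_, _, child) =>
    val ordering = requiredOrdering(w)
    child match {
      case Sort(cols, _) if cols == ordering => w // ordering already satisfied
      case _ if ordering.isEmpty             => w // nothing to sort by
      case _ => w.copy(child = Sort(ordering, child))
    }
  case other => other
}

// A write into a table dynamically partitioned by "dt" and bucketed by "id".
val query = V1Write(
  dynamicPartitionColumns = Seq("dt"),
  bucketColumns = Seq("id"),
  child = Relation(Seq("id", "value", "dt")))

val planned = pullOutSort(query)
println(planned)
```

Applying the rule twice is a no-op, which matches how planner rules are expected to behave: once the Sort is explicit in the plan, the rule recognizes the ordering as satisfied.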
was:
FileFormatWriter.write is currently used by every V1 write path, covering both datasource and Hive tables. However, it internally adds a sort on dynamic partition and bucket columns, and this sort is not visible in the query plan.
V2 writes take a better approach: the rule `V2Writes` adds the required ordering (or even distribution) to the plan.
V1 writes should do the same as V2 writes.
> Pull out dynamic partition and bucket sort from FileFormatWriter
> ----------------------------------------------------------------
>
> Key: SPARK-37287
> URL: https://issues.apache.org/jira/browse/SPARK-37287
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 3.3.0
> Reporter: XiDuo You
> Priority: Major
>
> FileFormatWriter.write is currently used by every V1 write path, covering both datasource and Hive tables. However, it internally adds a sort based on the dynamic partition and bucket columns, and this sort is not visible in the query plan.
> V2 writes take a better approach: the rule `V2Writes` adds the required ordering (or even distribution) to the plan.
> V1 writes should do the same as V2 writes.
>
--
This message was sent by Atlassian Jira
(v8.20.1#820001)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org