Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2022/08/10 14:39:00 UTC

[jira] [Commented] (SPARK-40034) PathOutputCommitters to work with dynamic partition overwrite

    [ https://issues.apache.org/jira/browse/SPARK-40034?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17578014#comment-17578014 ] 

Apache Spark commented on SPARK-40034:
--------------------------------------

User 'steveloughran' has created a pull request for this issue:
https://github.com/apache/spark/pull/37468

> PathOutputCommitters to work with dynamic partition overwrite
> -------------------------------------------------------------
>
>                 Key: SPARK-40034
>                 URL: https://issues.apache.org/jira/browse/SPARK-40034
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core, SQL
>    Affects Versions: 3.4.0
>            Reporter: Steve Loughran
>            Priority: Minor
>
> Sibling of MAPREDUCE-7403: allow PathOutputCommitter implementations to declare that they support the semantics required by Spark dynamic partitioning:
> * rename works as expected
> * the working dir is on the same filesystem as the final dir
> They will do this by implementing StreamCapabilities and adding a new probe string, "mapreduce.job.committer.dynamic.partitioning"; the Spark-side changes are to
> * postpone rejection of dynamic partition overwrite until the output committer has been created
> * allow it if the committer implements StreamCapabilities and returns true for {{hasCapability("mapreduce.job.committer.dynamic.partitioning")}}
> This isn't going to be supported by the S3A committers, as they don't meet the requirements. The manifest committer of MAPREDUCE-7341 running against ABFS and GCS does work.
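
The capability probe described above can be sketched as follows. This is a minimal, self-contained illustration of the pattern, not Spark's actual code: the StreamCapabilities interface here is a local stand-in for org.apache.hadoop.fs.StreamCapabilities, and the committer class is hypothetical; only the capability string is taken from the issue.

```java
// Sketch of the "declare a capability, probe for it" pattern from the issue.
public class DynamicPartitioningProbe {

    // Capability string named in SPARK-40034 / MAPREDUCE-7403.
    static final String DYNAMIC_PARTITIONING =
        "mapreduce.job.committer.dynamic.partitioning";

    // Local stand-in for org.apache.hadoop.fs.StreamCapabilities.
    interface StreamCapabilities {
        boolean hasCapability(String capability);
    }

    // Hypothetical committer that declares support for dynamic
    // partition overwrite (the manifest committer would behave like this).
    static class ManifestLikeCommitter implements StreamCapabilities {
        @Override
        public boolean hasCapability(String capability) {
            return DYNAMIC_PARTITIONING.equals(capability);
        }
    }

    // Spark-side check, run only after the committer has been created:
    // allow dynamic partition overwrite iff the committer implements
    // StreamCapabilities and reports the capability.
    static boolean supportsDynamicPartitioning(Object committer) {
        return committer instanceof StreamCapabilities
            && ((StreamCapabilities) committer)
                   .hasCapability(DYNAMIC_PARTITIONING);
    }

    public static void main(String[] args) {
        // A committer declaring the capability is accepted.
        System.out.println(
            supportsDynamicPartitioning(new ManifestLikeCommitter()));
        // Any other committer is rejected.
        System.out.println(supportsDynamicPartitioning(new Object()));
    }
}
```

A committer that does not implement the interface at all (e.g. the S3A committers) simply fails the instanceof check, so no per-committer special-casing is needed on the Spark side.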



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org