Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 03:59:35 UTC

[jira] [Updated] (SPARK-23061) Support default write mode, settable via spark config

     [ https://issues.apache.org/jira/browse/SPARK-23061?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-23061:
---------------------------------
    Labels: bulk-closed  (was: )

> Support default write mode, settable via spark config
> -----------------------------------------------------
>
>                 Key: SPARK-23061
>                 URL: https://issues.apache.org/jira/browse/SPARK-23061
>             Project: Spark
>          Issue Type: Brainstorming
>          Components: ML, MLlib, SQL
>    Affects Versions: 2.1.0
>            Reporter: Stephan Sahm
>            Priority: Minor
>              Labels: bulk-closed
>
> It would be helpful to be able to change the default write mode from "error" to, for instance, "overwrite" via a new Spark config value. This is especially useful when using libraries that internally call DataFrame.write, but it would also help in general.
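
A minimal sketch of the idea in Scala, for illustration only. The config key
"spark.sql.writer.defaultSaveMode" is hypothetical (no such setting exists in
Spark); today the save mode must be passed explicitly on every write, which is
what the ticket proposes to make configurable.

    import org.apache.spark.sql.{SaveMode, SparkSession}

    val spark = SparkSession.builder()
      .appName("default-write-mode-sketch")
      // Hypothetical config this ticket asks for: change the default
      // save mode from "error" (ErrorIfExists) to "overwrite".
      // This key is NOT a real Spark setting and has no effect today.
      .config("spark.sql.writer.defaultSaveMode", "overwrite")
      .getOrCreate()

    val df = spark.range(10).toDF("id")

    // Current behaviour: without an explicit .mode(...), this write fails
    // if the target path already exists, because the default save mode
    // is ErrorIfExists ("error").
    df.write.parquet("/tmp/example_output")

    // Today's workaround: set the mode explicitly on each write, or hope
    // that any library wrapping DataFrame.write exposes a way to do so.
    df.write.mode(SaveMode.Overwrite).parquet("/tmp/example_output")
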



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org