Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2019/10/08 05:42:12 UTC

[jira] [Resolved] (SPARK-24515) No need to warn user when output commit coordination is enabled

     [ https://issues.apache.org/jira/browse/SPARK-24515?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-24515.
----------------------------------
    Resolution: Incomplete

> No need to warn user when output commit coordination is enabled
> ----------------------------------------------------------------
>
>                 Key: SPARK-24515
>                 URL: https://issues.apache.org/jira/browse/SPARK-24515
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 2.3.1
>            Reporter: zhoukang
>            Priority: Major
>              Labels: bulk-closed
>
>  
> There is no need to warn the user when output commit coordination is enabled.
> {code:java}
> // When speculation is on and the output committer class name contains "Direct", we should
> // warn users that they may lose data if they are using a direct output committer.
> val speculationEnabled = self.conf.getBoolean("spark.speculation", false)
> val outputCommitterClass = hadoopConf.get("mapred.output.committer.class", "")
> if (speculationEnabled && outputCommitterClass.contains("Direct")) {
>   val warningMessage =
>     s"$outputCommitterClass may be an output committer that writes data directly to " +
>       "the final location. Because speculation is enabled, this output committer may " +
>       "cause data loss (see the case in SPARK-10063). If possible, please use an output " +
>       "committer that does not have this behavior (e.g. FileOutputCommitter)."
>   logWarning(warningMessage)
> }
> {code}
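The snippet above warns whenever speculation is on and the committer name contains "Direct". The issue's point is that when the output commit coordinator is on, only one task attempt is allowed to commit, so the warning condition could also take coordination into account. A minimal sketch of that gating, assuming the config key "spark.hadoop.outputCommitCoordination.enabled" (default true) and the hypothetical helper name below:

```scala
// Hypothetical sketch, not the actual Spark patch: factor the warning
// condition into a pure predicate so the direct-committer warning is only
// emitted when output commit coordination is disabled. The config key
// "spark.hadoop.outputCommitCoordination.enabled" and its default of true
// are assumptions modeled on Spark's internal setting.
object DirectCommitterWarning {
  def shouldWarn(
      speculationEnabled: Boolean,
      outputCommitterClass: String,
      commitCoordinationEnabled: Boolean): Boolean = {
    // Warn only when all three risk factors line up: speculative execution,
    // a "Direct"-style committer, and no coordinator arbitrating commits.
    speculationEnabled &&
      outputCommitterClass.contains("Direct") &&
      !commitCoordinationEnabled
  }
}
```

The call site would then read the coordination flag from the Spark conf (e.g. `self.conf.getBoolean("spark.hadoop.outputCommitCoordination.enabled", true)`) and pass it as the third argument, leaving the existing warning text unchanged.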



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org