Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/03/01 22:27:18 UTC
[jira] [Commented] (SPARK-13601) Invoke task failure callbacks before calling outputstream.close()
[ https://issues.apache.org/jira/browse/SPARK-13601?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15174433#comment-15174433 ]
Apache Spark commented on SPARK-13601:
--------------------------------------
User 'davies' has created a pull request for this issue:
https://github.com/apache/spark/pull/11450
> Invoke task failure callbacks before calling outputstream.close()
> -----------------------------------------------------------------
>
> Key: SPARK-13601
> URL: https://issues.apache.org/jira/browse/SPARK-13601
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Reporter: Davies Liu
> Assignee: Davies Liu
>
> We need to submit another PR against Spark to call the task failure callbacks before Spark calls the close function on various output streams.
> For example, we need to intercept an exception and call TaskContext.markTaskFailed before calling close in the following code (in PairRDDFunctions.scala):
> {code}
> Utils.tryWithSafeFinally {
>   while (iter.hasNext) {
>     val record = iter.next()
>     writer.write(record._1.asInstanceOf[AnyRef], record._2.asInstanceOf[AnyRef])
>     // Update bytes written metric every few records
>     maybeUpdateOutputMetrics(outputMetricsAndBytesWrittenCallback, recordsWritten)
>     recordsWritten += 1
>   }
> } {
>   writer.close()
> }
> {code}
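> As a rough sketch of the intended change (illustrative only; the final PR may use a dedicated helper in Utils rather than an inline try/catch, and {{context}} here is assumed to be the current TaskContext), the write loop would catch the exception, run the task failure callbacks while the stream is still open, and only then close the writer:
> {code}
> try {
>   while (iter.hasNext) {
>     val record = iter.next()
>     writer.write(record._1.asInstanceOf[AnyRef], record._2.asInstanceOf[AnyRef])
>     // Update bytes written metric every few records
>     maybeUpdateOutputMetrics(outputMetricsAndBytesWrittenCallback, recordsWritten)
>     recordsWritten += 1
>   }
> } catch {
>   case t: Throwable =>
>     // Run the task failure callbacks before close(), while the partially
>     // written output is still accessible to listeners.
>     context.markTaskFailed(t)
>     throw t
> } finally {
>   writer.close()
> }
> {code}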
> Changes to Spark should include unit tests to make sure this always works in the future.
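> A minimal, self-contained sketch of what such a test could assert (hypothetical names; it stubs out markTaskFailed and close rather than using Spark's real TaskContext) is that the failure callback observably runs before the stream is closed:
> {code}
> import scala.collection.mutable.ArrayBuffer
>
> object CallbackOrderingCheck extends App {
>   val events = ArrayBuffer.empty[String]
>   def markTaskFailed(t: Throwable): Unit = events += "failureCallback"
>   def close(): Unit = events += "close"
>
>   try {
>     try {
>       throw new RuntimeException("simulated write failure")
>     } catch {
>       case t: Throwable =>
>         markTaskFailed(t)   // callback must fire while the "stream" is open
>         throw t
>     } finally {
>       close()
>     }
>   } catch { case _: Throwable => () }
>
>   // The callback must come before close, which is the point of this issue.
>   assert(events == Seq("failureCallback", "close"))
> }
> {code}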