Posted to issues@spark.apache.org by "L. C. Hsieh (Jira)" <ji...@apache.org> on 2023/10/26 21:01:00 UTC

[jira] [Created] (SPARK-45678) Cover BufferReleasingInputStream.available under tryOrFetchFailedException

L. C. Hsieh created SPARK-45678:
-----------------------------------

             Summary: Cover BufferReleasingInputStream.available under tryOrFetchFailedException
                 Key: SPARK-45678
                 URL: https://issues.apache.org/jira/browse/SPARK-45678
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core
    Affects Versions: 4.0.0
            Reporter: L. C. Hsieh


We have encountered a shuffle data corruption issue:

```
Caused by: java.io.IOException: FAILED_TO_UNCOMPRESS(5)
	at org.xerial.snappy.SnappyNative.throw_error(SnappyNative.java:112)
	at org.xerial.snappy.SnappyNative.rawUncompress(Native Method)
	at org.xerial.snappy.Snappy.rawUncompress(Snappy.java:504)
	at org.xerial.snappy.Snappy.uncompress(Snappy.java:543)
	at org.xerial.snappy.SnappyInputStream.hasNextChunk(SnappyInputStream.java:450)
	at org.xerial.snappy.SnappyInputStream.available(SnappyInputStream.java:497)
	at org.apache.spark.storage.BufferReleasingInputStream.available(ShuffleBlockFetcherIterator.scala:1356)
```

Spark shuffle has the capacity to detect corruption for a few stream operations like `read` and `skip`: an `IOException` thrown there is rethrown as a `FetchFailedException`, which causes the failed shuffle task to be retried. But in the stack trace above it is `available` that fails, and that operation is not covered by the mechanism. So no retry happened and the Spark application simply failed.

Since the `available` operation can also trigger data decompression, we should be able to wrap it the same way `read` and `skip` are wrapped.
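
A minimal sketch of the idea (simplified; the real `BufferReleasingInputStream` in `ShuffleBlockFetcherIterator.scala` carries more state such as the block id, map index and address, and its `tryOrFetchFailedException` rethrows via the iterator rather than as shown here):

```scala
import java.io.{IOException, InputStream}

// Simplified stand-in for Spark's BufferReleasingInputStream.
private class BufferReleasingInputStream(delegate: InputStream) extends InputStream {

  // `read` and `skip` are already wrapped, so an IOException caused by
  // corrupt shuffle data becomes a FetchFailedException and the task retries.
  override def read(): Int = tryOrFetchFailedException(delegate.read())
  override def skip(n: Long): Long = tryOrFetchFailedException(delegate.skip(n))

  // Proposed: wrap `available` too, since e.g. SnappyInputStream.available
  // may decompress the next chunk and hit the same corruption.
  override def available(): Int = tryOrFetchFailedException(delegate.available())

  // Sketch only: in Spark this calls iterator.throwFetchFailedException(...)
  // when corruption detection is enabled, instead of rethrowing directly.
  private def tryOrFetchFailedException[T](block: => T): T = {
    try {
      block
    } catch {
      case e: IOException =>
        throw e // placeholder for throwFetchFailedException(blockId, ..., e)
    }
  }
}
```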



