Posted to issues@spark.apache.org by "Imran Rashid (JIRA)" <ji...@apache.org> on 2019/04/17 17:04:01 UTC
[jira] [Resolved] (SPARK-24936) Better error message when trying a shuffle fetch over 2 GB
[ https://issues.apache.org/jira/browse/SPARK-24936?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Imran Rashid resolved SPARK-24936.
----------------------------------
Resolution: Won't Fix
As we've already shipped 2.4, I think it's unlikely we're going to fix this later. I don't think we need to worry that much about Spark 3.0 talking to shuffle services older than 2.2.
If anybody is motivated, feel free to submit a pr here, but I think leaving this open is probably misleading about the status.
> Better error message when trying a shuffle fetch over 2 GB
> ----------------------------------------------------------
>
> Key: SPARK-24936
> URL: https://issues.apache.org/jira/browse/SPARK-24936
> Project: Spark
> Issue Type: Sub-task
> Components: Spark Core
> Affects Versions: 2.4.0
> Reporter: Imran Rashid
> Priority: Major
>
> After SPARK-24297, Spark will try to fetch shuffle blocks to disk if they're over 2 GB. However, this will fail with an external shuffle service running a version older than Spark 2.2, with an unhelpful error message like:
> {noformat}
> 18/07/26 07:15:02 WARN scheduler.TaskSetManager: Lost task 1.0 in stage 1.3 (TID 15, xyz.com, executor 2): FetchFailed(BlockManagerId(1, xyz.com, 7337, None), shuffleId=0, mapId=1, reduceId=1, message=
> org.apache.spark.shuffle.FetchFailedException: java.lang.UnsupportedOperationException
> at org.apache.spark.network.server.StreamManager.openStream(StreamManager.java:60)
> at org.apache.spark.network.server.TransportRequestHandler.processStreamRequest(TransportRequestHandler.java:136)
> ...
> {noformat}
> We can't do anything to make the shuffle succeed in this situation, but we should fail with a better error message.
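For anyone picking this up: a minimal sketch of what a friendlier failure could look like. The object and helper name (ShuffleFetchDiagnostics, diagnoseStreamFetchFailure) and the exact wording are hypothetical, not actual Spark API; a real patch would wrap the failure somewhere along the shuffle fetch path before surfacing it to the user.
{noformat}
import java.io.IOException

object ShuffleFetchDiagnostics {
  // Hypothetical helper (not actual Spark API): wrap the opaque
  // UnsupportedOperationException thrown by a pre-2.2 shuffle service's
  // StreamManager.openStream in a message that says what to do about it.
  def diagnoseStreamFetchFailure(cause: Throwable): Throwable = {
    val msg = Option(cause.getMessage).getOrElse("")
    if (msg.contains("java.lang.UnsupportedOperationException")) {
      new IOException(
        "Failed to fetch a shuffle block over 2 GB as a stream. The remote " +
          "external shuffle service is likely older than Spark 2.2 and does " +
          "not support stream requests. Upgrade the external shuffle " +
          "service, or repartition so that no shuffle block exceeds 2 GB.",
        cause)
    } else {
      cause
    }
  }
}
{noformat}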