Posted to issues@spark.apache.org by "Patrick Wendell (JIRA)" <ji...@apache.org> on 2014/09/25 08:30:34 UTC

[jira] [Commented] (SPARK-3687) Spark hang while processing more than 100 sequence files

    [ https://issues.apache.org/jira/browse/SPARK-3687?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14147465#comment-14147465 ] 

Patrick Wendell commented on SPARK-3687:
----------------------------------------

Can you perform a jstack on the executor when it is hanging? We usually only post things on JIRA like this when a specific issue has been debugged a bit more. But if you can produce a jstack of the hung executor we can keep it open :)
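For reference, a thread dump of a hung executor can usually be captured with the JDK's jstack tool, run on the machine hosting the executor against the executor's JVM process id (the pid and output file name below are placeholders):

    jstack -l <executor-pid> > executor-jstack.txt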

> Spark hang while processing more than 100 sequence files
> --------------------------------------------------------
>
>                 Key: SPARK-3687
>                 URL: https://issues.apache.org/jira/browse/SPARK-3687
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.0.2, 1.1.0
>            Reporter: Ziv Huang
>
> In my application, I read more than 100 sequence files into a JavaPairRDD, perform a flatMap to get another JavaRDD, and then use takeOrdered to get the result.
> Quite often (but not always), Spark hangs while executing some of the 110th-130th tasks.
> The job can hang for several hours, maybe forever (I can't wait for its completion).
> When the Spark job hangs, I can't find any error message anywhere, and I can't kill the job from the web UI.
> The current workaround is to use coalesce to reduce the number of partitions to be processed.
> The job never hangs if the number of partitions to be processed is no greater than 80.
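
For context, a minimal sketch of the workload described above (not the reporter's actual code): the Text key/value types, the HDFS glob path, the whitespace-splitting flatMap, and taking the 10 smallest strings are all assumptions for illustration; the coalesce(80) call reflects the workaround mentioned in the report. It is written against the Spark 1.x Java API, where the flatMap function returns an Iterable:

    // Minimal sketch of the reported workload (illustrative only).
    import java.util.Arrays;
    import java.util.List;

    import org.apache.hadoop.io.Text;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class SequenceFileJob {
      public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("SequenceFileJob");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Read 100+ sequence files matched by a glob into a pair RDD.
        // Key/value classes and the path are assumptions for illustration.
        JavaPairRDD<Text, Text> pairs =
            sc.sequenceFile("hdfs:///data/seqfiles/*", Text.class, Text.class);

        // flatMap each record's value into whitespace-separated tokens
        // (Spark 1.x Java API: the flatMap function returns an Iterable).
        JavaRDD<String> words = pairs.flatMap(
            tuple -> Arrays.asList(tuple._2().toString().split("\\s+")));

        // Workaround from the report: coalesce to at most 80 partitions
        // before the action, which reportedly avoids the hang.
        JavaRDD<String> coalesced = words.coalesce(80);

        // takeOrdered triggers the job; here it returns the 10 smallest strings.
        List<String> result = coalesced.takeOrdered(10);
        for (String s : result) {
          System.out.println(s);
        }

        sc.stop();
      }
    }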


