Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/05/05 00:03:06 UTC
[jira] [Commented] (SPARK-7236) AkkaUtils askWithReply sleeps indefinitely when a timeout exception is thrown
[ https://issues.apache.org/jira/browse/SPARK-7236?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14527415#comment-14527415 ]
Apache Spark commented on SPARK-7236:
-------------------------------------
User 'BryanCutler' has created a pull request for this issue:
https://github.com/apache/spark/pull/5896
> AkkaUtils askWithReply sleeps indefinitely when a timeout exception is thrown
> -----------------------------------------------------------------------------
>
> Key: SPARK-7236
> URL: https://issues.apache.org/jira/browse/SPARK-7236
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Reporter: Bryan Cutler
> Priority: Critical
> Labels: quickfix
> Attachments: SparkLongSleepAfterTimeout.scala
>
>
> When {{AkkaUtils.askWithReply}} gets a TimeoutException, the default parameters {{maxAttempts = 1}} and {{retryInterval = Int.MaxValue}} cause the thread to sleep for {{Int.MaxValue}} milliseconds (roughly 25 days) before the call finally fails.
> I noticed this issue when testing for SPARK-6980 and using this function without invoking Spark jobs, so perhaps it acts differently in another context.
> If this function is on its final attempt to ask and it fails, it should return immediately. Also, perhaps a better default {{retryInterval}} would be {{0}}.
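> The shape of the problem and the suggested fix can be sketched as below. This is an illustrative simplification, not the actual AkkaUtils source; {{RetrySketch}}, {{askFlawed}}, and {{askFixed}} are hypothetical names. The flawed variant sleeps {{retryInterval}} even after the last attempt has failed; the fixed variant skips the sleep once no retries remain, so it fails fast regardless of the interval.

```scala
// Illustrative sketch only -- not the real AkkaUtils.askWithReply code.
object RetrySketch {

  // Flawed shape: sleeps after every failed attempt, including the last one.
  // With maxAttempts = 1 and retryInterval = Int.MaxValue (ms), a single
  // timeout blocks the thread for ~25 days before the error surfaces.
  def askFlawed[T](maxAttempts: Int, retryInterval: Long)(attempt: () => T): T = {
    var lastError: Throwable = null
    for (_ <- 1 to maxAttempts) {
      try {
        return attempt()
      } catch {
        case e: Throwable => lastError = e
      }
      Thread.sleep(retryInterval) // runs even when no attempts remain
    }
    throw lastError
  }

  // Fixed shape: only sleep if another attempt will actually be made,
  // so the final failure is rethrown immediately.
  def askFixed[T](maxAttempts: Int, retryInterval: Long)(attempt: () => T): T = {
    var lastError: Throwable = null
    for (i <- 1 to maxAttempts) {
      try {
        return attempt()
      } catch {
        case e: Throwable => lastError = e
      }
      if (i < maxAttempts) Thread.sleep(retryInterval) // skip sleep on last try
    }
    throw lastError
  }
}
```

> With this shape, setting {{retryInterval}} to {{0}} as suggested would also be harmless, since the interval is only consulted between attempts.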
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org