Posted to issues@spark.apache.org by "Imran Rashid (JIRA)" <ji...@apache.org> on 2015/07/03 04:37:04 UTC

[jira] [Updated] (SPARK-6980) Akka timeout exceptions indicate which conf controls them

     [ https://issues.apache.org/jira/browse/SPARK-6980?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Imran Rashid updated SPARK-6980:
--------------------------------
    Assignee: Bryan Cutler  (was: Harsh Gupta)

> Akka timeout exceptions indicate which conf controls them
> ---------------------------------------------------------
>
>                 Key: SPARK-6980
>                 URL: https://issues.apache.org/jira/browse/SPARK-6980
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Imran Rashid
>            Assignee: Bryan Cutler
>            Priority: Minor
>              Labels: starter
>             Fix For: 1.5.0
>
>         Attachments: Spark-6980-Test.scala
>
>
> If you hit one of the akka timeouts, you just get an exception like
> {code}
> java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
> {code}
> The exception doesn't indicate how to change the timeout, though there is usually (always?) a corresponding setting in {{SparkConf}}.  It would be nice if the exception included the relevant setting.
> I think this should be pretty easy to do -- we just need to create something like a {{NamedTimeout}}.  It would have its own {{await}} method that catches the akka timeout and throws its own exception naming the relevant setting (see the sketch below).  We should change {{RpcUtils.askTimeout}} and {{RpcUtils.lookupTimeout}} to always return a {{NamedTimeout}}, so we can be sure that any time we hit a timeout, we get a better exception.
> Given the latest refactoring to the rpc layer, this needs to be done in both {{AkkaUtils}} and {{AkkaRpcEndpoint}}.
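> A minimal sketch of the idea in Scala (the name {{NamedTimeout}}, the method name, and the message wording here are illustrative, not a final API):
> {code}
> import java.util.concurrent.TimeoutException
> import scala.concurrent.{Await, Awaitable}
> import scala.concurrent.duration.FiniteDuration
>
> // A timeout that remembers which SparkConf property controls it.
> class NamedTimeout(val duration: FiniteDuration, val confKey: String) {
>   // Like Await.result, but names the controlling conf key on timeout.
>   def awaitResult[T](awaitable: Awaitable[T]): T = {
>     try {
>       Await.result(awaitable, duration)
>     } catch {
>       case te: TimeoutException =>
>         throw new TimeoutException(
>           s"${te.getMessage}. This timeout is controlled by $confKey")
>     }
>   }
> }
> {code}
> {{RpcUtils.askTimeout}} could then build one of these from the conf, e.g. {{new NamedTimeout(timeout, "spark.rpc.askTimeout")}} (conf key assumed for illustration), so every caller gets the improved message for free.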



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org