Posted to reviews@spark.apache.org by BryanCutler <gi...@git.apache.org> on 2015/06/01 02:22:13 UTC

[GitHub] spark pull request: [SPARK-6980] [CORE] [WIP] Akka timeout excepti...

Github user BryanCutler commented on the pull request:

    https://github.com/apache/spark/pull/6205#issuecomment-107262210
  
    @squito I was thinking of using this in place of what we have, but it actually wouldn't cover the case where `Await.result` times out while the future is still not completed.  To cover all cases, we could add a function to `RpcTimeout` that takes a `Future[T]` as input and uses `recover` (as in the code above) to handle a timeout on the future.  We would then have to subclass `TimeoutException`, as you suggested, so we don't end up amending the message twice.  If you want, I can code this up so we can see how it would look.
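
    To make the idea concrete, here is a rough sketch of what I'm picturing (just a sketch; `RpcTimeoutException`, `addMessageIfTimeout`, and `awaitResult` are placeholder names, not anything final):

    ```scala
    import java.util.concurrent.TimeoutException
    import scala.concurrent.{Await, ExecutionContext, Future}
    import scala.concurrent.duration.FiniteDuration

    // Subclassing TimeoutException lets us tell an already-amended
    // exception apart from a raw one, so the message is only added once.
    class RpcTimeoutException(message: String, cause: TimeoutException)
      extends TimeoutException(message) {
      initCause(cause)
    }

    class RpcTimeout(duration: FiniteDuration, confProperty: String) {

      private def amended(te: TimeoutException): RpcTimeoutException =
        new RpcTimeoutException(
          te.getMessage + ". This timeout is controlled by " + confProperty, te)

      // Covers the case where the future itself fails with a timeout.
      def addMessageIfTimeout[T](future: Future[T])
          (implicit ec: ExecutionContext): Future[T] =
        future.recover {
          case rte: RpcTimeoutException => throw rte  // already amended
          case te: TimeoutException     => throw amended(te)
        }

      // Covers the case where Await.result itself times out while the
      // future is still not completed.
      def awaitResult[T](future: Future[T]): T =
        try {
          Await.result(future, duration)
        } catch {
          case rte: RpcTimeoutException => throw rte  // already amended
          case te: TimeoutException     => throw amended(te)
        }
    }
    ```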
    
    One thing I'm not too sure of, though: `recover` requires an `ExecutionContext`.  In the test code I posted it was fine to import the implicit default global context, but I don't know whether the same holds for the rest of Spark.  Any thoughts on this?
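
    For reference, this is all the test code needed to get a context in scope (whether the global context is appropriate inside Spark core is exactly what I'm unsure about; `timeout` and `future` below refer to the sketch above):

    ```scala
    import scala.concurrent.ExecutionContext.Implicits.global

    // recover (inside addMessageIfTimeout) picks up the implicit context
    val wrapped = timeout.addMessageIfTimeout(future)
    ```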

