Posted to issues@spark.apache.org by "Xianyang Liu (JIRA)" <ji...@apache.org> on 2017/07/18 13:04:00 UTC

[jira] [Created] (SPARK-21455) RpcFailure should be delivered through RpcResponseCallback.onFailure

Xianyang Liu created SPARK-21455:
------------------------------------

             Summary: RpcFailure should be delivered through RpcResponseCallback.onFailure
                 Key: SPARK-21455
                 URL: https://issues.apache.org/jira/browse/SPARK-21455
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.2.0
            Reporter: Xianyang Liu


Currently, when an `RpcFailure` needs to be sent back to the client, we call `RpcCallContext.sendFailure`, which in turn calls `NettyRpcCallContext.send`. However, look at the following code snippet from the implementation class:

```
private[netty] class RemoteNettyRpcCallContext(
    nettyEnv: NettyRpcEnv,
    callback: RpcResponseCallback,
    senderAddress: RpcAddress)
  extends NettyRpcCallContext(senderAddress) {

  // Every reply, including an RpcFailure, is serialized and delivered
  // through onSuccess; onFailure is never invoked on this path.
  override protected def send(message: Any): Unit = {
    val reply = nettyEnv.serialize(message)
    callback.onSuccess(reply)
  }
}
```
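
A minimal sketch of the direction the summary suggests, assuming `RpcFailure` is the `case class RpcFailure(e: Throwable)` defined alongside `NettyRpcEnv`: match on the message and route failures through the failure path instead of serializing them into `onSuccess`.

```
// Sketch only, not a committed fix: deliver RpcFailure through
// onFailure and keep onSuccess for ordinary replies.
override protected def send(message: Any): Unit = message match {
  case RpcFailure(e) => callback.onFailure(e)
  case _             => callback.onSuccess(nettyEnv.serialize(message))
}
```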

This is unreasonable for two reasons:

1. The failure message is sent back through `RpcResponseCallback.onSuccess`, and in the current implementation that is the only way the detailed exception information (such as the stack trace) reaches the caller.
2. `RpcResponseCallback.onSuccess` and `RpcResponseCallback.onFailure` can have different behavior, for example in `NettyBlockTransferService#uploadBlock` (see the sketch after the snippet below):

```
new RpcResponseCallback {
  override def onSuccess(response: ByteBuffer): Unit = {
    logTrace(s"Successfully uploaded block $blockId")
    result.success((): Unit)
  }
  override def onFailure(e: Throwable): Unit = {
    logError(s"Error while uploading block $blockId", e)
    result.failure(e)
  }
})
```

The two branches do genuinely different work here (completing versus failing `result`), so a failure delivered through `onSuccess` would take the wrong branch and report a successful upload to the caller.
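
To make the consequence concrete, here is a hypothetical, self-contained sketch (the trait and the byte payload are illustrative stand-ins, not Spark's actual classes): a caller that completes a `Promise` from the two callbacks reports success when a remote failure is delivered through `onSuccess`.

```
import scala.concurrent.Promise

object CallbackDemo extends App {
  // Illustrative stand-in for the network layer's callback interface.
  trait RpcResponseCallback {
    def onSuccess(response: Array[Byte]): Unit
    def onFailure(e: Throwable): Unit
  }

  val result = Promise[Unit]()
  val callback = new RpcResponseCallback {
    override def onSuccess(response: Array[Byte]): Unit = result.success(())
    override def onFailure(e: Throwable): Unit = result.failure(e)
  }

  // Current behavior: the remote failure arrives as a serialized payload
  // on the success path, so the promise completes successfully.
  val serializedFailure: Array[Byte] = "RpcFailure(...)".getBytes("UTF-8")
  callback.onSuccess(serializedFailure)

  println(result.future.value) // Some(Success(())): the failure is invisible
}
```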


