Posted to issues@spark.apache.org by "Reynold Xin (JIRA)" <ji...@apache.org> on 2014/08/15 04:15:18 UTC

[jira] [Resolved] (SPARK-2940) Support fetching multiple blocks in a single request in Netty network module

     [ https://issues.apache.org/jira/browse/SPARK-2940?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Reynold Xin resolved SPARK-2940.
--------------------------------

       Resolution: Fixed
    Fix Version/s: 1.1.0

> Support fetching multiple blocks in a single request in Netty network module
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-2940
>                 URL: https://issues.apache.org/jira/browse/SPARK-2940
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Shuffle, Spark Core
>            Reporter: Reynold Xin
>            Assignee: Reynold Xin
>             Fix For: 1.1.0
>
>
> ShuffleCopier.getBlock fetches a single block per request, setting up and tearing down a full connection for every block:
> {code}
>   def getBlock(host: String, port: Int, blockId: BlockId,
>       resultCollectCallback: (BlockId, Long, ByteBuf) => Unit) {
>     val handler = new ShuffleCopier.ShuffleClientHandler(resultCollectCallback)
>     val connectTimeout = conf.getInt("spark.shuffle.netty.connect.timeout", 60000)
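>     // Note: a fresh FileClient (and its underlying connection) is created on every getBlock call.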
>     val fc = new FileClient(handler, connectTimeout)
>     try {
>       fc.init()
>       fc.connect(host, port)
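>       // Only a single block is requested before the connection is closed.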
>       fc.sendRequest(blockId.name)
>       fc.waitForClose()
>       fc.close()
>     } catch {
>       // Handle any socket-related exceptions in FileClient
>       case e: Exception =>
>         logError("Shuffle copy of block " + blockId + " from " + host + ":" + port + " failed", e)
>         handler.handleError(blockId)
>     }
>   }
> {code}
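>
> For contrast, a batched variant could accept every block wanted from a host up front and amortize connection setup across all of them. The sketch below is illustrative only, reusing the names from the snippet above; getBlocks and its request loop are assumptions, not the committed 1.1.0 API.
> {code}
>   // Hypothetical sketch: fetch many blocks over a single connection,
>   // amortizing connect/close across all requested blocks.
>   def getBlocks(host: String, port: Int, blockIds: Seq[BlockId],
>       resultCollectCallback: (BlockId, Long, ByteBuf) => Unit) {
>     val handler = new ShuffleCopier.ShuffleClientHandler(resultCollectCallback)
>     val connectTimeout = conf.getInt("spark.shuffle.netty.connect.timeout", 60000)
>     val fc = new FileClient(handler, connectTimeout)
>     try {
>       fc.init()
>       fc.connect(host, port)
>       // Pipeline one request per block on the already-open channel.
>       blockIds.foreach(id => fc.sendRequest(id.name))
>       fc.waitForClose()
>       fc.close()
>     } catch {
>       case e: Exception =>
>         logError("Shuffle copy of blocks from " + host + ":" + port + " failed", e)
>         blockIds.foreach(handler.handleError)
>     }
>   }
> {code}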



--
This message was sent by Atlassian JIRA
(v6.2#6252)
