Posted to issues@spark.apache.org by "DjvuLee (JIRA)" <ji...@apache.org> on 2015/01/23 06:09:34 UTC

[jira] [Created] (SPARK-5375) Specify more clearly about the max thread meaning in the ConnectionManager

DjvuLee created SPARK-5375:
------------------------------

             Summary: Specify more clearly about the max thread meaning in the ConnectionManager
                 Key: SPARK-5375
                 URL: https://issues.apache.org/jira/browse/SPARK-5375
             Project: Spark
          Issue Type: Improvement
    Affects Versions: 1.1.0
            Reporter: DjvuLee


In the ConnectionManager.scala file, there are three thread pools: handleMessageExecutor, handleReadWriteExecutor, and handleConnectExecutor.

For example, handleMessageExecutor:
private val handleMessageExecutor = new ThreadPoolExecutor(
    conf.getInt("spark.core.connection.handler.threads.min", 20),
    conf.getInt("spark.core.connection.handler.threads.max", 60),
    conf.getInt("spark.core.connection.handler.threads.keepalive", 60), TimeUnit.SECONDS,
    new LinkedBlockingDeque[Runnable](),
    Utils.namedThreadFactory("handle-message-executor"))

Since we use an unbounded LinkedBlockingDeque, the max thread parameter has no effect: ThreadPoolExecutor only creates threads beyond the core size when the work queue rejects a task, and an unbounded queue never does, so the pool never grows past the core size. Every time I read the code this is confusing (see the sketch below). Maybe we can add a comment in those places?
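
For illustration, a minimal standalone sketch (not Spark code; the pool sizes here are arbitrary) showing the ThreadPoolExecutor behaviour in question, namely that with an unbounded LinkedBlockingDeque the pool stays at the core size and the maximum is never used:

import java.util.concurrent.{LinkedBlockingDeque, ThreadPoolExecutor, TimeUnit}

object MaxThreadsDemo {
  def main(args: Array[String]): Unit = {
    // Same shape as handleMessageExecutor: small core size, larger max size,
    // but an *unbounded* work queue.
    val pool = new ThreadPoolExecutor(
      2,                 // corePoolSize
      60,                // maximumPoolSize (never reached, see below)
      60, TimeUnit.SECONDS,
      new LinkedBlockingDeque[Runnable]())

    // Submit far more tasks than the core threads can run at once.
    (1 to 100).foreach { _ =>
      pool.execute(new Runnable {
        override def run(): Unit = Thread.sleep(100)
      })
    }

    // Extra threads above corePoolSize are only created when the queue
    // rejects a task; an unbounded LinkedBlockingDeque never rejects,
    // so the pool size stays at the core size.
    Thread.sleep(500)
    println(s"pool size = ${pool.getPoolSize}")   // prints 2, not 60
    pool.shutdown()
  }
}

The same applies to handleReadWriteExecutor and handleConnectExecutor, since they are constructed the same way.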



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org