Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/12/06 16:19:59 UTC
[jira] [Commented] (SPARK-18742) readd spark.broadcast.factory for user-defined BroadcastFactory
[ https://issues.apache.org/jira/browse/SPARK-18742?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15725942#comment-15725942 ]
Apache Spark commented on SPARK-18742:
--------------------------------------
User 'windpiger' has created a pull request for this issue:
https://github.com/apache/spark/pull/16173
> readd spark.broadcast.factory for user-defined BroadcastFactory
> ---------------------------------------------------------------
>
> Key: SPARK-18742
> URL: https://issues.apache.org/jira/browse/SPARK-18742
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Reporter: Song Jun
> Priority: Minor
>
> After SPARK-12588 (Remove HTTPBroadcast) [1], the one and only
> implementation of BroadcastFactory is TorrentBroadcastFactory. No code
> in Spark 2 uses any BroadcastFactory other than TorrentBroadcastFactory,
> yet the scaladoc says [2]:
> /**
> * An interface for all the broadcast implementations in Spark (to allow
> * multiple broadcast implementations). SparkContext uses a user-specified
> * BroadcastFactory implementation to instantiate a particular broadcast for the
> * entire Spark job.
> */
> which is not correct, since there is no way to plug in a custom
> user-specified BroadcastFactory.
> It would be better to re-add spark.broadcast.factory so that a
> user-defined BroadcastFactory can be supplied.
>
> [1] https://issues.apache.org/jira/browse/SPARK-12588
> [2] https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/broadcast/BroadcastFactory.scala#L25-L30
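To illustrate what the proposal would enable, the sketch below shows a user-defined factory that delegates to the built-in TorrentBroadcastFactory while adding custom behaviour. This is a hedged sketch, not a confirmed API: the trait signatures are taken from Spark 2.x's BroadcastFactory.scala, the `LoggingBroadcastFactory` name is hypothetical, and since the trait is currently `private[spark]`, the change proposed in this issue would also need to expose it before code like this could compile outside Spark itself.

```scala
import scala.reflect.ClassTag

import org.apache.spark.{SecurityManager, SparkConf}
import org.apache.spark.broadcast.{Broadcast, BroadcastFactory, TorrentBroadcastFactory}

// Hypothetical user-defined factory: wraps the stock torrent-based
// implementation and logs each broadcast it creates.
class LoggingBroadcastFactory extends BroadcastFactory {

  private val delegate = new TorrentBroadcastFactory

  override def initialize(
      isDriver: Boolean, conf: SparkConf, securityMgr: SecurityManager): Unit =
    delegate.initialize(isDriver, conf, securityMgr)

  override def newBroadcast[T: ClassTag](
      value: T, isLocal: Boolean, id: Long): Broadcast[T] = {
    // Custom behaviour would go here; delegation keeps the semantics intact.
    println(s"Creating broadcast variable $id")
    delegate.newBroadcast(value, isLocal, id)
  }

  override def unbroadcast(
      id: Long, removeFromDriver: Boolean, blocking: Boolean): Unit =
    delegate.unbroadcast(id, removeFromDriver, blocking)

  override def stop(): Unit = delegate.stop()
}
```

Under the re-added configuration key (as it worked in Spark 1.x), such a class would then be selected with something like `spark.broadcast.factory=com.example.LoggingBroadcastFactory`; the class name and package here are assumptions for illustration only.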
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org