Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/12/15 09:49:59 UTC
[jira] [Updated] (SPARK-18742) Clarify that user-defined BroadcastFactory is not supported
[ https://issues.apache.org/jira/browse/SPARK-18742?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen updated SPARK-18742:
------------------------------
Priority: Trivial (was: Minor)
Component/s: (was: Spark Core)
Documentation
Summary: Clarify that user-defined BroadcastFactory is not supported (was: readd spark.broadcast.factory for user-defined BroadcastFactory)
> Clarify that user-defined BroadcastFactory is not supported
> -----------------------------------------------------------
>
> Key: SPARK-18742
> URL: https://issues.apache.org/jira/browse/SPARK-18742
> Project: Spark
> Issue Type: Improvement
> Components: Documentation
> Reporter: Song Jun
> Priority: Trivial
>
> After SPARK-12588 (Remove HTTPBroadcast) [1], the one and only
> implementation of BroadcastFactory is TorrentBroadcastFactory. No code
> in Spark 2 uses BroadcastFactory other than TorrentBroadcastFactory;
> however, the scaladoc says [2]:
> /**
> * An interface for all the broadcast implementations in Spark (to allow
> * multiple broadcast implementations). SparkContext uses a user-specified
> * BroadcastFactory implementation to instantiate a particular broadcast for the
> * entire Spark job.
> */
> which is not correct, since there is no way to plug in a custom
> user-specified BroadcastFactory.
> It would be better to re-add spark.broadcast.factory so that a user-defined BroadcastFactory can be used.
>
> [1] https://issues.apache.org/jira/browse/SPARK-12588
> [2] https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/broadcast/BroadcastFactory.scala#L25-L30
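
For illustration, a minimal sketch of what a pluggable factory lookup behind a "spark.broadcast.factory"-style setting could look like. The trait and class names mirror Spark's, but this is a simplified, hypothetical model (Spark's real BroadcastFactory has a different signature, and Spark 1.x resolved the configured class via reflection rather than the explicit match used here), not the project's actual code:

```scala
// Simplified stand-in for Spark's BroadcastFactory trait.
trait BroadcastFactory {
  def newBroadcast[T](value: T, id: Long): T // hypothetical, simplified signature
}

// Stand-in for the one real implementation, TorrentBroadcastFactory.
class TorrentBroadcastFactory extends BroadcastFactory {
  override def newBroadcast[T](value: T, id: Long): T = value
}

object BroadcastFactoryLoader {
  // Pick the factory named by the (hypothetical) config key,
  // defaulting to the torrent implementation when the key is absent.
  def load(conf: Map[String, String]): BroadcastFactory =
    conf.getOrElse("spark.broadcast.factory", "torrent") match {
      case "torrent" => new TorrentBroadcastFactory
      case other =>
        throw new IllegalArgumentException(s"Unknown broadcast factory: $other")
    }
}
```

With such a hook, SparkEnv could instantiate whichever factory the user configured instead of hard-coding TorrentBroadcastFactory, which is essentially what the scaladoc currently (and incorrectly) claims is possible.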
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org