Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/01/26 23:07:35 UTC
[jira] [Commented] (SPARK-5411) Allow SparkListeners to be specified in SparkConf and loaded when creating SparkContext
[ https://issues.apache.org/jira/browse/SPARK-5411?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14292494#comment-14292494 ]
Apache Spark commented on SPARK-5411:
-------------------------------------
User 'JoshRosen' has created a pull request for this issue:
https://github.com/apache/spark/pull/4111
> Allow SparkListeners to be specified in SparkConf and loaded when creating SparkContext
> ---------------------------------------------------------------------------------------
>
> Key: SPARK-5411
> URL: https://issues.apache.org/jira/browse/SPARK-5411
> Project: Spark
> Issue Type: New Feature
> Components: Spark Core
> Reporter: Josh Rosen
> Assignee: Josh Rosen
>
> It would be nice if there were a mechanism to allow SparkListeners to be registered through SparkConf settings. This would allow monitoring frameworks to be easily injected into Spark programs without having to modify those programs' code.
> I propose to introduce a new configuration option, {{spark.extraListeners}}, that allows SparkListeners to be specified in SparkConf and registered before the SparkContext is created. Here is the proposed documentation for the new option:
> {quote}
> A comma-separated list of classes that implement SparkListener; when initializing SparkContext, instances of these classes will be created and registered with Spark's listener bus. If a class has a single-argument constructor that accepts a SparkConf, that constructor will be called; otherwise, a zero-argument constructor will be called. If no valid constructor can be found, the SparkContext creation will fail with an exception.
> {quote}
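For illustration, here is a minimal sketch of a listener that could be registered through the proposed option. The package and class names are hypothetical; only the existing {{org.apache.spark.scheduler.SparkListener}} API is assumed.

{code:scala}
package com.example.monitoring // hypothetical package

import org.apache.spark.SparkConf
import org.apache.spark.scheduler.{SparkListener, SparkListenerJobEnd}

// Hypothetical monitoring listener. Under the proposed rules, the
// single-argument SparkConf constructor would be chosen over a
// zero-argument one if both were present.
class JobEndLogger(conf: SparkConf) extends SparkListener {
  private val appName = conf.get("spark.app.name", "unknown")

  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit = {
    println(s"[$appName] job ${jobEnd.jobId} finished: ${jobEnd.jobResult}")
  }
}
{code}

Such a listener could then be attached without touching application code, for example by passing {{--conf spark.extraListeners=com.example.monitoring.JobEndLogger}} to spark-submit.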
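And a rough sketch of the constructor-selection rule described in the proposed documentation; this is not the actual patch (see the pull request linked above for the real change), just one way the lookup could be expressed:

{code:scala}
import org.apache.spark.SparkConf
import org.apache.spark.scheduler.SparkListener

object ListenerLoading {
  // Sketch of the documented lookup order: prefer a constructor taking a
  // single SparkConf, fall back to a zero-argument constructor, otherwise fail.
  def createListener(className: String, conf: SparkConf): SparkListener = {
    val constructors = Class.forName(className).getConstructors
    val byConf = constructors.find(_.getParameterTypes.toSeq == Seq(classOf[SparkConf]))
    val noArg = constructors.find(_.getParameterTypes.isEmpty)
    val instance = byConf.map(_.newInstance(conf))
      .orElse(noArg.map(_.newInstance()))
      .getOrElse(throw new IllegalArgumentException(
        s"$className has neither a SparkConf constructor nor a zero-argument constructor"))
    instance.asInstanceOf[SparkListener]
  }
}
{code}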
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)