Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/04/27 18:52:39 UTC
[jira] [Updated] (SPARK-7170) Allow registering a SparkListener specified in SparkConf
[ https://issues.apache.org/jira/browse/SPARK-7170?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen updated SPARK-7170:
-----------------------------
Priority: Minor (was: Major)
Affects Version/s: (was: 1.4.0)
> Allow registering a SparkListener specified in SparkConf
> -------------------------------------------------------
>
> Key: SPARK-7170
> URL: https://issues.apache.org/jira/browse/SPARK-7170
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Affects Versions: 1.3.1
> Reporter: Jacek Lewandowski
> Priority: Minor
>
> Currently if the user wants to add a SparkListener to SparkContext, the listener has to be explicitly instantiated and added by calling {{SparkContext.addSparkListener}}.
> h5.Problem
> This is not strictly a problem, but rather an inconvenience in some cases. Suppose we want to provide a default listener, or listeners, common to all applications in some environment; or we want to add a listener to existing applications that are already compiled, or whose code we do not want to change.
> h5.Proposed solution
> We could just specify {{spark.driver.sparkListeners=a.b.c.SomeListener,...}} and the referenced class(es) would be instantiated and registered as listeners automatically when {{SparkContext}} is created. The proposed change does not modify the API; it only adds a handler for one more property in the Spark configuration.
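The registration logic described above could be sketched roughly as follows. This is a hypothetical illustration, not the actual Spark implementation: the property name {{spark.driver.sparkListeners}} is the one proposed in this ticket, {{SparkListener}} here is a stand-in trait for {{org.apache.spark.scheduler.SparkListener}}, and {{ListenerLoader}} and {{SomeListener}} are invented names. It assumes each listener class has a no-arg constructor and is loaded via reflection.

```scala
// Stand-in for org.apache.spark.scheduler.SparkListener (hypothetical).
trait SparkListener

// Example user-provided listener (hypothetical).
class SomeListener extends SparkListener

object ListenerLoader {
  // Parse the comma-separated class list from the proposed
  // spark.driver.sparkListeners property and instantiate each
  // class via its no-arg constructor using reflection.
  def loadListeners(confValue: String): Seq[SparkListener] =
    confValue.split(",").map(_.trim).filter(_.nonEmpty).toSeq.map { className =>
      Class.forName(className)
        .getDeclaredConstructor()
        .newInstance()
        .asInstanceOf[SparkListener]
    }
}
```

On {{SparkContext}} creation, the handler would read the property, call something like {{loadListeners}}, and pass each resulting instance to the existing {{addSparkListener}} method, so no new public API is required.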
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org