Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/11/26 15:43:59 UTC

[jira] [Resolved] (SPARK-5190) Allow spark listeners to be added before spark context gets initialized.

     [ https://issues.apache.org/jira/browse/SPARK-5190?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-5190.
------------------------------
       Resolution: Fixed
         Assignee: Josh Rosen
    Fix Version/s: 1.3.0

Resolved by https://github.com/apache/spark/pull/4111

> Allow spark listeners to be added before spark context gets initialized.
> ------------------------------------------------------------------------
>
>                 Key: SPARK-5190
>                 URL: https://issues.apache.org/jira/browse/SPARK-5190
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Kostas Sakellis
>            Assignee: Josh Rosen
>             Fix For: 1.3.0
>
>
> Currently, you need a SparkContext instance to register Spark listeners. But if you wait until the SparkContext has been created before adding your listener, you may miss early events such as blockManagerAdded or executorAdded. We should fix this so a listener can be attached to the SparkContext before it starts any initialization.
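
For context: as of Spark 1.3.0 this is exposed through the spark.extraListeners configuration, which instantiates and registers the named listener classes during SparkContext construction, so startup events such as blockManagerAdded are not missed. A minimal Scala sketch (the listener class, object name, and app name below are illustrative, not taken from the patch):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.scheduler.{SparkListener, SparkListenerBlockManagerAdded, SparkListenerExecutorAdded}

    // Logs events that fire while the SparkContext is still starting up.
    class StartupLoggingListener extends SparkListener {
      override def onBlockManagerAdded(event: SparkListenerBlockManagerAdded): Unit =
        println(s"Block manager added: ${event.blockManagerId}")
      override def onExecutorAdded(event: SparkListenerExecutorAdded): Unit =
        println(s"Executor added: ${event.executorId}")
    }

    object ListenerExample {
      def main(args: Array[String]): Unit = {
        // Naming the listener class in the configuration means Spark creates and
        // registers it during SparkContext construction, before startup events
        // are posted, rather than after sc is returned to user code.
        val conf = new SparkConf()
          .setAppName("listener-example")
          .set("spark.extraListeners", classOf[StartupLoggingListener].getName)
        val sc = new SparkContext(conf)
        try {
          sc.parallelize(1 to 10).count()  // any job; listener output appears on the driver
        } finally {
          sc.stop()
        }
      }
    }

Note that, per the configuration documentation, a class listed in spark.extraListeners needs either a zero-argument constructor or a constructor taking a single SparkConf, since Spark instantiates it by class name.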



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org