Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/01/20 04:24:34 UTC

[jira] [Commented] (SPARK-5190) Allow spark listeners to be added before spark context gets initialized.

    [ https://issues.apache.org/jira/browse/SPARK-5190?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14283361#comment-14283361 ] 

Apache Spark commented on SPARK-5190:
-------------------------------------

User 'JoshRosen' has created a pull request for this issue:
https://github.com/apache/spark/pull/4111

> Allow spark listeners to be added before spark context gets initialized.
> ------------------------------------------------------------------------
>
>                 Key: SPARK-5190
>                 URL: https://issues.apache.org/jira/browse/SPARK-5190
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Kostas Sakellis
>
> Currently, you need a SparkContext instance in order to add a Spark listener. But if you wait until the SparkContext has been created before adding your listener, you can miss events that fire during initialization, such as blockManagerAdded or executorAdded. We should fix this so a listener can be attached to the SparkContext before it starts any initialization.
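
For context, here is a minimal sketch of the timing problem described above (the class and object names are made up for illustration; the linked pull request itself is not reproduced here). A listener registered through SparkContext.addSparkListener can only be attached after the context's constructor has finished, by which time the driver's block manager, and on a cluster possibly some executors, may already have registered:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.scheduler.{SparkListener, SparkListenerBlockManagerAdded, SparkListenerExecutorAdded}

    // Records the early lifecycle events mentioned in the issue.
    class StartupEventsListener extends SparkListener {
      override def onBlockManagerAdded(event: SparkListenerBlockManagerAdded): Unit =
        println(s"block manager added: ${event.blockManagerId}")

      override def onExecutorAdded(event: SparkListenerExecutorAdded): Unit =
        println(s"executor added: ${event.executorId}")
    }

    object ListenerTimingDemo {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("listener-timing").setMaster("local[2]"))
        // By this point the driver's block manager (and, on a cluster, early executors)
        // may already have registered, so the listener added below can miss those events.
        sc.addSparkListener(new StartupEventsListener())
        sc.stop()
      }
    }

One configuration-driven way around this in later Spark releases is to name the listener class in spark.extraListeners, so it is instantiated and registered while the SparkContext is still being constructed; whether that is exactly what the pull request above implements is not stated in this message.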



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org