Posted to issues@spark.apache.org by "Imran Rashid (Jira)" <ji...@apache.org> on 2019/10/10 15:58:00 UTC

[jira] [Commented] (SPARK-29396) Extend Spark plugin interface to driver

    [ https://issues.apache.org/jira/browse/SPARK-29396?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16948720#comment-16948720 ] 

Imran Rashid commented on SPARK-29396:
--------------------------------------

My hack to get around this in the past was to create a "SparkListener" that simply ignored every event it received, since that lets you run arbitrary code in the driver after most initialization but before anything else runs.  It's an ugly API for sure, so it would be nice to improve it -- but I'm curious whether there is a functional shortcoming you need to address as well?
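
For reference, a minimal sketch of that workaround (the class name and the setup code are illustrative, not from the original thread):

    import org.apache.spark.scheduler.SparkListener

    // All of SparkListener's event handlers are no-ops by default, so an
    // empty subclass silently ignores every event. Its only purpose is
    // that the class body runs on the driver when SparkContext
    // instantiates the listener.
    class DriverSetupListener extends SparkListener {
      // Illustrative driver-side setup; replace with real plugin logic.
      System.err.println("driver-side setup running")
    }

Registered via spark.extraListeners (e.g. --conf spark.extraListeners=DriverSetupListener), the constructor runs during SparkContext initialization, which is what makes the trick work.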

> Extend Spark plugin interface to driver
> ---------------------------------------
>
>                 Key: SPARK-29396
>                 URL: https://issues.apache.org/jira/browse/SPARK-29396
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Marcelo Masiero Vanzin
>            Priority: Major
>
> Spark provides an extension API for implementing executor plugins, added in SPARK-24918 and later extended in SPARK-28091.
> That API does not offer any way to do similar things on the driver side, though. As a consequence, there is no good way for executor plugins to get information from, or communicate in any way with, the Spark driver.
> I've been playing with such an improved API while developing some new functionality. I'll file a few child bugs for the work to get the changes in.
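
To make the gap concrete, here is a rough sketch against the executor-side API added in SPARK-24918 (the interface shape is recalled from that change and may differ after SPARK-28091; the class name and comments are hypothetical):

    import org.apache.spark.ExecutorPlugin

    // Runs on each executor; there is no driver-side counterpart, so any
    // coordination with the driver needs a hand-rolled side channel.
    class MonitoringPlugin extends ExecutorPlugin {
      override def init(): Unit = {
        // Executor-local setup only.
      }
      override def shutdown(): Unit = {
        // Executor-local cleanup.
      }
    }

Such a plugin is registered via spark.executor.plugins; the proposal in this issue is to add matching driver-side hooks so plugins like this can report back to, or be coordinated from, the driver.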



