Posted to issues@spark.apache.org by "Weiyi Kong (Jira)" <ji...@apache.org> on 2020/10/27 02:45:00 UTC

[jira] [Updated] (SPARK-33249) Add status plugin for live application

     [ https://issues.apache.org/jira/browse/SPARK-33249?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Weiyi Kong updated SPARK-33249:
-------------------------------
    Description: 
There are cases where a developer may want to extend the current REST API of the Web UI. In most cases, adding an external module is a better option than directly editing the original Spark code.

For an external module to extend the REST API of the Web UI, two things may need to be done:
 * Add extra API endpoints to provide extra status info. This can be done simply by implementing another ApiRequestContext, which will be loaded automatically.
 * If the info cannot be computed from the data already in the store, add extra listeners to generate it.
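The second point can be sketched as follows. This is a self-contained illustration only: the event type, store, and listener below are simplified stand-ins for Spark's SparkListener API and status store, not the real classes.

```scala
// Stand-in for a Spark listener event (illustrative only).
case class TaskEnd(durationMs: Long)

// Stand-in for a status (KV) store backing the REST API.
class StatusStore {
  private val data = scala.collection.mutable.Map[String, Long]()
  def write(key: String, value: Long): Unit = data(key) = value
  def read(key: String): Option[Long] = data.get(key)
}

// An extra listener that derives a custom metric and writes it to the
// store, where a custom REST endpoint could later read it.
class TotalTaskTimeListener(store: StatusStore) {
  private var total = 0L
  def onTaskEnd(event: TaskEnd): Unit = {
    total += event.durationMs
    store.write("totalTaskTimeMs", total)
  }
}

object ListenerDemo {
  def main(args: Array[String]): Unit = {
    val store = new StatusStore
    val listener = new TotalTaskTimeListener(store)
    listener.onTaskEnd(TaskEnd(120))
    listener.onTaskEnd(TaskEnd(80))
    println(store.read("totalTaskTimeMs").get) // prints 200
  }
}
```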

For the history server, there is an interface called AppHistoryServerPlugin, loaded via SPI, which provides a method to create listeners. In a live application, the only way is spark.extraListeners, which is based on Utils.loadExtensions. But this is not enough for these cases.
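For reference, the existing live-application mechanism is wired in purely through configuration, e.g. (the class and jar names are placeholders):

```shell
# Register an extra listener in a live application. The class must be on
# the driver classpath and expose a zero-arg or SparkConf constructor.
spark-submit \
  --conf spark.extraListeners=com.example.ExtraMetricsListener \
  --class com.example.MyApp \
  myapp.jar
```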

To let the API serve the status info, the data needs to be written to the AppStatusStore, which is the only store an API can reach, via "ui.store" or "ui.sc.statusStore". But listeners created by Utils.loadExtensions only receive a SparkConf in their constructor, so they are unable to write to the AppStatusStore.
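The constraint above comes from how Utils.loadExtensions instantiates classes: it only looks for a constructor taking a single SparkConf (or no arguments), so there is no slot through which a store could be injected. A self-contained sketch of that reflective pattern, with SparkConf and the listener type reduced to stubs:

```scala
// Simplified stubs, not the real Spark classes (illustrative only).
class SparkConf
trait Listener
class MyListener(conf: SparkConf) extends Listener

object LoadDemo {
  // Reflectively instantiate a listener class, preferring a (SparkConf)
  // constructor and falling back to a no-arg one. Note that nothing else,
  // e.g. a status store, can be passed in this way.
  def loadExtension[T <: Listener](klass: Class[T], conf: SparkConf): T = {
    val ctor =
      try klass.getConstructor(classOf[SparkConf])
      catch { case _: NoSuchMethodException => klass.getConstructor() }
    if (ctor.getParameterCount == 1) ctor.newInstance(conf)
    else ctor.newInstance()
  }

  def main(args: Array[String]): Unit = {
    val listener = loadExtension(classOf[MyListener], new SparkConf)
    println(listener.getClass.getSimpleName) // prints MyListener
  }
}
```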

So I think we still need a plugin like AppHistoryServerPlugin for the live UI. To address concerns like SPARK-22786, the plugin for live applications can be kept separate from the history server one, and also loaded using Utils.loadExtensions with an extra configuration. So by default, nothing will be loaded.
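A sketch of what such a live-application plugin could look like, mirroring AppHistoryServerPlugin's createListeners contract. The trait name, the config key, and the stub types below are hypothetical, not an actual Spark API:

```scala
// Simplified stubs standing in for Spark types (illustrative only).
class SparkConf {
  private val settings = scala.collection.mutable.Map[String, String]()
  def set(k: String, v: String): SparkConf = { settings(k) = v; this }
  def get(k: String, default: String): String = settings.getOrElse(k, default)
}
class ElementTrackingStore
trait SparkListener

// Hypothetical live-app counterpart to AppHistoryServerPlugin: unlike
// spark.extraListeners, it hands the plugin the status store as well,
// so its listeners can write derived data for a custom REST endpoint.
trait AppStatusPlugin {
  def createListeners(conf: SparkConf, store: ElementTrackingStore): Seq[SparkListener]
}

class ExtraMetricsPlugin extends AppStatusPlugin {
  override def createListeners(
      conf: SparkConf,
      store: ElementTrackingStore): Seq[SparkListener] =
    Seq(new SparkListener {})
}

object PluginDemo {
  def main(args: Array[String]): Unit = {
    // Loaded only when explicitly configured (the key is hypothetical),
    // so by default nothing is loaded.
    val conf = new SparkConf()
      .set("spark.ui.statusPlugins", classOf[ExtraMetricsPlugin].getName)
    val listeners =
      new ExtraMetricsPlugin().createListeners(conf, new ElementTrackingStore)
    println(listeners.size) // prints 1
  }
}
```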


> Add status plugin for live application
> --------------------------------------
>
>                 Key: SPARK-33249
>                 URL: https://issues.apache.org/jira/browse/SPARK-33249
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core, Web UI
>    Affects Versions: 2.4.7, 3.0.1
>            Reporter: Weiyi Kong
>            Priority: Minor
>



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
