Posted to issues@spark.apache.org by "Anirudh Ramanathan (JIRA)" <ji...@apache.org> on 2018/07/12 08:08:01 UTC
[jira] [Updated] (SPARK-24793) Make spark-submit more useful with k8s
[ https://issues.apache.org/jira/browse/SPARK-24793?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Anirudh Ramanathan updated SPARK-24793:
---------------------------------------
Description:
Support controlling the lifecycle of a Spark application through spark-submit.
For example:
```
--kill app_name      If given, kills the driver specified.
--status app_name    If given, requests the status of the driver specified.
```
Potentially also a --list flag to list all running Spark drivers.
Given that our submission client can launch jobs into many different namespaces, we will likely also need a --namespace flag to specify the target namespace.
I think this is pretty useful to have, instead of forcing users to fall back to kubectl to manage the lifecycle of any k8s Spark application.
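A minimal sketch of how these lifecycle flags might dispatch, using a hypothetical in-memory `DriverRegistry` as a stand-in for the Kubernetes API server (the flag names follow the proposal above; the class and function names here are illustrative, not actual spark-submit code):

```python
# Hypothetical sketch of the proposed spark-submit lifecycle flags.
# DriverRegistry stands in for the Kubernetes API server; a real
# implementation would query and delete driver pods in the cluster.

import argparse


class DriverRegistry:
    """In-memory stand-in for driver pods, keyed by namespace."""

    def __init__(self):
        # namespace -> {driver_name: status}
        self._drivers = {}

    def submit(self, namespace, name):
        self._drivers.setdefault(namespace, {})[name] = "RUNNING"

    def kill(self, namespace, name):
        drivers = self._drivers.get(namespace, {})
        if name not in drivers:
            raise KeyError(f"no driver {name!r} in namespace {namespace!r}")
        drivers[name] = "KILLED"

    def status(self, namespace, name):
        return self._drivers.get(namespace, {}).get(name, "NOT_FOUND")

    def list(self, namespace):
        return sorted(self._drivers.get(namespace, {}))


def run(argv, registry):
    """Parse the proposed flags and dispatch against the registry."""
    parser = argparse.ArgumentParser(prog="spark-submit")
    parser.add_argument("--kill", metavar="app_name")
    parser.add_argument("--status", metavar="app_name")
    parser.add_argument("--list", action="store_true")
    parser.add_argument("--namespace", default="default")
    args = parser.parse_args(argv)

    if args.kill:
        registry.kill(args.namespace, args.kill)
        return f"killed {args.kill}"
    if args.status:
        return registry.status(args.namespace, args.status)
    if args.list:
        return ", ".join(registry.list(args.namespace))
    return "nothing to do"
```

Note how --namespace scopes every operation, which matches the observation above that the submission client can launch jobs into many different namespaces.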
was:
Support controlling the lifecycle of a Spark application through spark-submit.
For example:
```
--kill app_name      If given, kills the driver specified.
--status app_name    If given, requests the status of the driver specified.
```
I think this is pretty useful to have, instead of forcing users to fall back to kubectl to manage the lifecycle of any k8s Spark application.
> Make spark-submit more useful with k8s
> --------------------------------------
>
> Key: SPARK-24793
> URL: https://issues.apache.org/jira/browse/SPARK-24793
> Project: Spark
> Issue Type: Improvement
> Components: Kubernetes
> Affects Versions: 2.3.0
> Reporter: Anirudh Ramanathan
> Assignee: Anirudh Ramanathan
> Priority: Major
>
> Support controlling the lifecycle of a Spark application through spark-submit.
> For example:
> ```
> --kill app_name      If given, kills the driver specified.
> --status app_name    If given, requests the status of the driver specified.
> ```
> Potentially also a --list flag to list all running Spark drivers.
> Given that our submission client can launch jobs into many different namespaces, we will likely also need a --namespace flag to specify the target namespace.
> I think this is pretty useful to have, instead of forcing users to fall back to kubectl to manage the lifecycle of any k8s Spark application.
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)