Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/08/20 07:20:22 UTC

[GitHub] [airflow] jherrmannNetfonds commented on issue #16290: Allow deleting existing spark application before creating new one via SparkKubernetesOperator in Kubernetes

jherrmannNetfonds commented on issue #16290:
URL: https://github.com/apache/airflow/issues/16290#issuecomment-902489540


   Hi, I modified the `SparkKubernetesSensor` so that it deletes the Spark application once it has finished successfully. To handle retries, I include `{{ task_instance.try_number }}` in the SparkApplication name, so I can still look at the logs of earlier attempts.
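   The per-retry naming described above could be sketched as a templated application spec rendered by Airflow's Jinja templating; the application name `my-spark-app` and the namespace are illustrative placeholders, not values from the original comment:

   ```python
   # Hypothetical sketch: a SparkApplication spec whose metadata.name embeds
   # {{ task_instance.try_number }}, so each Airflow retry creates a distinctly
   # named resource and the logs of earlier attempts remain available.
   SPARK_APP_SPEC = """
   apiVersion: sparkoperator.k8s.io/v1beta2
   kind: SparkApplication
   metadata:
     name: my-spark-app-{{ task_instance.try_number }}
     namespace: default
   """
   ```

   When such a string is passed as the operator's templated application file, Airflow substitutes the current try number at render time.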
   
   My proposal is to add an option to the sensor that deletes the Spark application, controlled by a flag with the values `ALWAYS`, `ON_FAILURE`, and `ON_SUCCESS`.
   What do you all think?
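   The proposed flag could be modeled as a small policy enum plus a pure decision function. This is a minimal sketch of the idea only; the names `DeletePolicy` and `should_delete` are illustrative and not part of Airflow or the cncf.kubernetes provider:

   ```python
   from enum import Enum


   class DeletePolicy(Enum):
       """Illustrative values for the proposed deletion flag."""
       ALWAYS = "ALWAYS"
       ON_FAILURE = "ON_FAILURE"
       ON_SUCCESS = "ON_SUCCESS"


   def should_delete(policy: DeletePolicy, application_succeeded: bool) -> bool:
       """Decide whether the sensor should delete the SparkApplication
       custom resource after it has reached a terminal state."""
       if policy is DeletePolicy.ALWAYS:
           return True
       if policy is DeletePolicy.ON_SUCCESS:
           return application_succeeded
       return not application_succeeded  # DeletePolicy.ON_FAILURE
   ```

   A subclassed sensor could call `should_delete` once the application reaches a terminal state and then remove the custom resource, e.g. via the Kubernetes `CustomObjectsApi`; the exact hook helpers vary between provider versions.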


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org