Posted to commits@airflow.apache.org by "blcksrx (via GitHub)" <gi...@apache.org> on 2023/07/05 09:07:13 UTC

[GitHub] [airflow] blcksrx commented on issue #32363: K8S Spark Operator doesn't delete an older application with the same name anymore

blcksrx commented on issue #32363:
URL: https://github.com/apache/airflow/issues/32363#issuecomment-1621341214

   Well, if you look closely at the older PR, you will find the `delete` operation defined in the `KubernetesHook`, and it also produces logs about the `SparkApplication`. Beside the fact that `creating_namespaced_object` is not only about creating `SparkApplications`, this violates the single responsibility principle of the `creating_namespaced_object` method.
   In addition, since it deletes that `namespaced_object`, it rewrites the object's history in k8s, and it can no longer be understood why that object failed the first time!
   ```python
   if "name" in body_dict["metadata"]:
       try:
           api.delete_namespaced_custom_object(
               group=group,
               version=version,
               namespace=namespace,
               plural=plural,
               name=body_dict["metadata"]["name"],
           )
           self.log.warning("Deleted SparkApplication with the same name")
       except client.rest.ApiException:
           self.log.info("SparkApplication %s not found", body_dict["metadata"]["name"])
   ```
   It is better to use a simple macro in the object name, such as `name-{{ ds }}`.
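   A minimal sketch of what the macro suggestion gives you (the `render_app_name` helper and template string are hypothetical stand-ins; in a real DAG, Airflow's Jinja engine renders `{{ ds }}` to the run's logical date, so each run submits a `SparkApplication` with a distinct name instead of colliding with an earlier one):
   ```python
   def render_app_name(template: str, ds: str) -> str:
       # Hypothetical stand-in for Airflow's Jinja templating: substitute
       # {{ ds }} with the run's logical date (YYYY-MM-DD).
       return template.replace("{{ ds }}", ds)

   # Two different logical dates produce two distinct application names,
   # so no delete of the previous object is needed before submitting.
   print(render_app_name("my-spark-app-{{ ds }}", "2023-07-04"))
   print(render_app_name("my-spark-app-{{ ds }}", "2023-07-05"))
   ```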


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org