Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2020/06/17 08:53:51 UTC

[GitHub] [airflow] ywan2017 commented on a change in pull request #9081: Get spark driver pod status if log stream interrupted accidentally

ywan2017 commented on a change in pull request #9081:
URL: https://github.com/apache/airflow/pull/9081#discussion_r441388924



##########
File path: airflow/providers/apache/spark/hooks/spark_submit.py
##########
@@ -404,11 +404,16 @@ def submit(self, application="", **kwargs):
         # Check spark-submit return code. In Kubernetes mode, also check the value
         # of exit code in the log, as it may differ.
         if returncode or (self._is_kubernetes and self._spark_exit_code != 0):
-            raise AirflowException(
-                "Cannot execute: {}. Error code is: {}.".format(
-                    self._mask_cmd(spark_submit_cmd), returncode
+            # double check by spark driver pod status (blocking function)
+            spark_driver_pod_status = self._start_k8s_pod_status_tracking()

Review comment:
       Yes, thanks for that. I've split the 'if' conditions so the pod-status check has no effect when not running in Kubernetes mode.
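
       For readers following along, a minimal sketch of how the split conditions could look is below. It assumes _start_k8s_pod_status_tracking() (from the hunk above) blocks until the driver pod reaches a terminal phase and returns that phase as a string; the 'Succeeded' comparison and the surrounding control flow are illustrative assumptions, not the merged code.

           # Check spark-submit return code. In Kubernetes mode, also check the value
           # of exit code in the log, as it may differ.
           if returncode or (self._is_kubernetes and self._spark_exit_code != 0):
               if self._is_kubernetes:
                   # The log stream may have been interrupted, so double-check the
                   # actual driver pod status (blocking call) before failing the task.
                   spark_driver_pod_status = self._start_k8s_pod_status_tracking()
                   if spark_driver_pod_status != 'Succeeded':
                       raise AirflowException(
                           "Cannot execute: {}. Driver pod status is: {}.".format(
                               self._mask_cmd(spark_submit_cmd), spark_driver_pod_status
                           )
                       )
               else:
                   raise AirflowException(
                       "Cannot execute: {}. Error code is: {}.".format(
                           self._mask_cmd(spark_submit_cmd), returncode
                       )
                   )

       Splitting the branches this way keeps the original behaviour for non-Kubernetes deployments, while Kubernetes runs fall back to the driver pod status when the spark-submit exit code is unreliable.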




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org