Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2020/06/15 10:58:07 UTC

[GitHub] [airflow] ashb commented on a change in pull request #9081: Get spark driver pod status if log stream interrupted accidentally

ashb commented on a change in pull request #9081:
URL: https://github.com/apache/airflow/pull/9081#discussion_r440094842



##########
File path: airflow/providers/apache/spark/hooks/spark_submit.py
##########
@@ -404,11 +404,16 @@ def submit(self, application="", **kwargs):
         # Check spark-submit return code. In Kubernetes mode, also check the value
         # of exit code in the log, as it may differ.
         if returncode or (self._is_kubernetes and self._spark_exit_code != 0):
-            raise AirflowException(
-                "Cannot execute: {}. Error code is: {}.".format(
-                    self._mask_cmd(spark_submit_cmd), returncode
+            # double check by spark driver pod status (blocking function)
+            spark_driver_pod_status = self._start_k8s_pod_status_tracking()

Review comment:
       This is going to fail hard when not in Kubernetes mode.
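
       For context, here is a minimal sketch (not part of the PR) of how the extra pod-status lookup could be guarded on self._is_kubernetes so the non-Kubernetes path keeps its original error handling. The class, the _raise_on_failure method, and the stub bodies are illustrative stand-ins, not the real SparkSubmitHook code:

       from airflow.exceptions import AirflowException


       class _SparkSubmitSketch:
           """Illustrative stand-in for SparkSubmitHook; not the real hook."""

           def __init__(self, is_kubernetes):
               self._is_kubernetes = is_kubernetes
               self._spark_exit_code = 1 if is_kubernetes else 0

           def _mask_cmd(self, spark_submit_cmd):
               # Stand-in for the real secret-masking of the submit command.
               return " ".join(spark_submit_cmd)

           def _start_k8s_pod_status_tracking(self):
               # Stand-in for the blocking driver-pod status check the PR adds.
               return "Failed"

           def _raise_on_failure(self, returncode, spark_submit_cmd):
               if returncode or (self._is_kubernetes and self._spark_exit_code != 0):
                   pod_status = None
                   if self._is_kubernetes:
                       # Only track the driver pod when the job actually ran on
                       # Kubernetes; in any other deploy mode there is no pod to
                       # query and the call would fail outright.
                       pod_status = self._start_k8s_pod_status_tracking()
                   raise AirflowException(
                       "Cannot execute: {}. Error code is: {}. Driver pod status: {}.".format(
                           self._mask_cmd(spark_submit_cmd), returncode, pod_status
                       )
                   )

       The condition on the branch can be true from returncode alone, so without the inner guard the pod-status call would also run for YARN, standalone, or local submissions, which is the failure mode the review comment points out.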



