Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2019/06/03 13:18:30 UTC

[GitHub] [airflow] andy-g14 commented on issue #5347: [AIRFLOW-4598] Task retries are not exhausted for K8s executor

andy-g14 commented on issue #5347: [AIRFLOW-4598] Task retries are not exhausted for K8s executor
URL: https://github.com/apache/airflow/pull/5347#issuecomment-498252826
 
 
   @dimberman The section of code I removed marks a task as failed whenever the watcher job reports a "Failed" event, without checking whether the task is still eligible for retries. 
   To answer your question: tasks that are in the QUEUED state would be marked failed here https://github.com/apache/airflow/blob/e8c5c7a3d83e5bf506cd12ee8d0d4eb3e4436025/airflow/jobs/scheduler_job.py#L1256, in the _process_executor_events method, based on whether their retries have been exhausted. Tasks that were in the RUNNING state when the watcher received a "Failed" event would instead be collected as zombies. 
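   A rough sketch of the retry-vs-fail decision described above (this is illustrative only, not the actual _process_executor_events code; the helper name handle_executor_failure is hypothetical, and the use of TaskInstance.is_eligible_to_retry() is an assumption about how the eligibility check would be expressed):

       # Illustrative sketch: deciding between a retry and a terminal failure
       # for a task instance (ti) that the executor reported as failed.
       from airflow.utils.state import State

       def handle_executor_failure(ti):
           if ti.state == State.QUEUED:
               if ti.is_eligible_to_retry():
                   # Retries not yet exhausted: let the scheduler re-queue it later.
                   ti.state = State.UP_FOR_RETRY
               else:
                   # No retries left: this is a terminal failure.
                   ti.state = State.FAILED
           # Tasks that were RUNNING when the watcher saw the "Failed" pod event
           # are not handled here; they get picked up by the zombie-detection path.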

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services