Posted to commits@airflow.apache.org by "Vasudha Putta (Jira)" <ji...@apache.org> on 2019/11/08 16:11:00 UTC

[jira] [Updated] (AIRFLOW-5871) Stopping/Clearing a running Airflow task instance doesn't actually terminate the job.

     [ https://issues.apache.org/jira/browse/AIRFLOW-5871?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Vasudha Putta updated AIRFLOW-5871:
-----------------------------------
    Priority: Critical  (was: Major)

> Stopping/Clearing a running Airflow task instance doesn't actually terminate the job.
> -------------------------------------------------------------------------------------
>
>                 Key: AIRFLOW-5871
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-5871
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: scheduler
>    Affects Versions: 1.10.1
>            Reporter: Vasudha Putta
>            Priority: Critical
>
> Hi Team,
> When I change the state of a running task instance to cleared/failed, the underlying job does not actually terminate. I have tried this with both PythonOperator and BashOperator. The task connects to Oracle and executes a package, and even terminating/killing the Airflow task process does not terminate the Oracle sessions. This is a problem because whenever we need to recompile the package we have to pause the DAGs, mark the existing DAG runs as cleared, and then kill the Oracle sessions by hand. Is there a way to cleanly stop DAG runs in Airflow? A rough sketch of the kind of workaround I have in mind follows below.
>  
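> Here is a minimal sketch of what I think such a hook might look like (this is only an assumption on my part, not a verified fix: it assumes cx_Oracle, and the operator name OraclePackageOperator and the connection parameters are just placeholders). It overrides on_kill(), which Airflow invokes after the task process receives SIGTERM, so the in-flight Oracle call can be cancelled and the session closed:
> {code:python}
> import cx_Oracle
> from airflow.models import BaseOperator
> from airflow.utils.decorators import apply_defaults
>
>
> class OraclePackageOperator(BaseOperator):
>     """Runs a PL/SQL call and tears down the Oracle session if the task is killed."""
>
>     @apply_defaults
>     def __init__(self, user, password, dsn, plsql, **kwargs):
>         super(OraclePackageOperator, self).__init__(**kwargs)
>         self.user = user
>         self.password = password
>         self.dsn = dsn
>         self.plsql = plsql
>         self._conn = None
>
>     def execute(self, context):
>         # Keep a handle to the connection so on_kill() can reach it.
>         self._conn = cx_Oracle.connect(self.user, self.password, self.dsn)
>         cursor = self._conn.cursor()
>         try:
>             cursor.execute(self.plsql)
>             self._conn.commit()
>         finally:
>             cursor.close()
>             self._conn.close()
>             self._conn = None
>
>     def on_kill(self):
>         # Called after the task process gets SIGTERM
>         # (e.g. when the run is cleared or marked failed in the UI).
>         if self._conn is not None:
>             self._conn.cancel()  # break the in-flight call on the server side
>             self._conn.close()   # end the Oracle session
> {code}
> I have not verified this end to end; it is only meant to illustrate the kind of clean-stop behaviour I am asking about.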



--
This message was sent by Atlassian Jira
(v8.3.4#803005)