Posted to commits@airflow.apache.org by "Kamil Szkoda (JIRA)" <ji...@apache.org> on 2018/03/07 16:27:00 UTC

[jira] [Created] (AIRFLOW-2189) Scheduler under systemd doesn't work in parallel

Kamil Szkoda created AIRFLOW-2189:
-------------------------------------

             Summary: Scheduler under systemd doesn't work in parallel
                 Key: AIRFLOW-2189
                 URL: https://issues.apache.org/jira/browse/AIRFLOW-2189
             Project: Apache Airflow
          Issue Type: Bug
          Components: scheduler
    Affects Versions: 1.9.0
            Reporter: Kamil Szkoda


I'm running the scheduler under System V init. In that setup all DAGs can run in parallel.

I just migrated to systemd, and even though I have the same configuration, I can't run DAGs in parallel.

Just an example:
 # When one job runs for 30 minutes (for example a Spark execution), the rest of the jobs just queue up. This blocks execution of the other, much lighter jobs.
 # When a DAG's code fails, the scheduler doesn't execute the rest of the DAGs.
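
For reference, a scheduler started by systemd does not inherit an interactive shell environment, so AIRFLOW_HOME (and AIRFLOW_CONFIG, if used) have to be passed to the service explicitly, otherwise Airflow falls back to its defaults. A minimal sketch of a scheduler unit, assuming an "airflow" user, an environment file at /etc/sysconfig/airflow, and an illustrative path to the airflow binary (these are assumptions, not my actual setup):

    [Unit]
    Description=Airflow scheduler daemon
    After=network.target postgresql.service mysql.service

    [Service]
    # The environment file should export AIRFLOW_HOME (and AIRFLOW_CONFIG if used),
    # otherwise the scheduler reads the default configuration instead of mine.
    EnvironmentFile=/etc/sysconfig/airflow
    User=airflow
    Group=airflow
    Type=simple
    ExecStart=/usr/local/bin/airflow scheduler
    Restart=always
    RestartSec=5s

    [Install]
    WantedBy=multi-user.target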

I have no Celery configuration. Would Celery with systemd resolve my problems?
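
For context on the Celery question: as far as I understand, task parallelism is controlled by the executor setting in airflow.cfg rather than by the init system. The default SequentialExecutor runs one task at a time, LocalExecutor runs tasks in parallel on a single host, and CeleryExecutor distributes them across worker machines. A rough sketch of the relevant part of airflow.cfg (the connection string is illustrative):

    [core]
    # SequentialExecutor (the default) runs tasks one at a time.
    # LocalExecutor allows parallel tasks on one host; CeleryExecutor needs separate workers.
    executor = LocalExecutor
    # LocalExecutor requires a real database backend (e.g. Postgres or MySQL), not SQLite.
    sql_alchemy_conn = postgresql+psycopg2://airflow:airflow@localhost:5432/airflow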



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)