Posted to dev@airflow.apache.org by Jason Chen <ch...@gmail.com> on 2017/06/04 00:29:44 UTC

Quick questions on Airflow with CeleryExecutor

Hi Airflow team,

 (1) I am running Airflow v1.7.1.3 with the CeleryExecutor.

 (2) I have two instances running against the same Redis queue (rough config sketch below):
(a) Instance1 is running the webserver, scheduler, worker, and flower
(b) Instance2 is running the webserver, worker, and flower
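
For reference, both instances point at the same broker in airflow.cfg. This is only a rough sketch of my setup, not the exact file; the hostnames, database, and credentials are placeholders, and the key names are as I recall them from the 1.7.x config:

    [core]
    executor = CeleryExecutor
    # both instances share the same metadata DB (placeholder URI)
    sql_alchemy_conn = mysql://airflow:***@metadata-db-host/airflow

    [celery]
    # both instances point at the same Redis instance (placeholder host)
    broker_url = redis://redis-host:6379/0
    celery_result_backend = redis://redis-host:6379/0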

  (3) On Instance2, I noticed two processes running at the same time, listed below (the ps command I used to spot them is sketched after them). Is this expected? It looks like both are "airflow run" invocations for the same DAG and task; one has "--local" and the other has "--job_id" and "--raw":
(a) /usr/bin/python27 /usr/local/bin/airflow run dag_name task_name
2017-06-03T17:15:00 --local -sd DAGS_FOLDER/dag_name.py
(b) /usr/bin/python27 /usr/local/bin/airflow run dag_name task_name
2017-06-03T17:15:00 --job_id 299966 --raw -sd DAGS_FOLDER/dag_name.py
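
For what it's worth, I found these with an ordinary process listing along these lines (nothing Airflow-specific):

    ps aux | grep "airflow run" | grep -v grep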

(4) What exactly is the airflow worker doing here? It starts fine ("airflow worker ..."; the startup commands I use on each instance are sketched below), but it's not clear to me what role the workers play. Are they pulling tasks from the Airflow Celery queue? And what is the relationship between the "scheduler" and the "worker"?

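For completeness, this is roughly how I start the services on each instance (stock CLI commands, ports and extra flags omitted; nothing custom):

    # Instance1
    airflow webserver
    airflow scheduler
    airflow worker
    airflow flower

    # Instance2
    airflow webserver
    airflow worker
    airflow flower
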
Thanks.
-Jason