Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2020/10/23 18:03:21 UTC

[GitHub] [airflow] kaxil edited a comment on issue #11788: Scheduler does not appear to be running properly in default airflow setup

kaxil edited a comment on issue #11788:
URL: https://github.com/apache/airflow/issues/11788#issuecomment-715492269


   With `./breeze start-airflow --backend postgres --db-reset --load-example-dags`
   
   The scheduler is stuck at:
   
   ```
   root@e08ca4038769:/opt/airflow# airflow scheduler
     ____________       _____________
    ____    |__( )_________  __/__  /________      __
   ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
   ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
    _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
   [2020-10-23 17:56:20,810] {scheduler_job.py:1270} INFO - Starting the scheduler
   [2020-10-23 17:56:20,810] {scheduler_job.py:1275} INFO - Processing each file at most -1 times
   [2020-10-23 17:56:20,811] {scheduler_job.py:1297} INFO - Resetting orphaned tasks for active dag runs
   [2020-10-23 17:56:20,831] {dag_processing.py:250} INFO - Launched DagFileProcessorManager with pid: 372
   [2020-10-23 17:56:20,850] {settings.py:49} INFO - Configured default timezone Timezone('UTC')
   [2020-10-23 17:59:55,167] {scheduler_job.py:976} INFO - 4 tasks up for execution:
           <TaskInstance: example_bash_operator.runme_0 2020-10-21 00:00:00+00:00 [scheduled]>
           <TaskInstance: example_bash_operator.runme_1 2020-10-21 00:00:00+00:00 [scheduled]>
           <TaskInstance: example_bash_operator.runme_2 2020-10-21 00:00:00+00:00 [scheduled]>
           <TaskInstance: example_bash_operator.also_run_this 2020-10-21 00:00:00+00:00 [scheduled]>
   [2020-10-23 17:59:55,171] {scheduler_job.py:1011} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 4 task instances ready to be queued
   [2020-10-23 17:59:55,171] {scheduler_job.py:1038} INFO - DAG example_bash_operator has 0/16 running and queued tasks
   [2020-10-23 17:59:55,171] {scheduler_job.py:1038} INFO - DAG example_bash_operator has 1/16 running and queued tasks
   [2020-10-23 17:59:55,171] {scheduler_job.py:1038} INFO - DAG example_bash_operator has 2/16 running and queued tasks
   [2020-10-23 17:59:55,171] {scheduler_job.py:1038} INFO - DAG example_bash_operator has 3/16 running and queued tasks
   [2020-10-23 17:59:55,172] {scheduler_job.py:1090} INFO - Setting the following tasks to queued state:
           <TaskInstance: example_bash_operator.runme_0 2020-10-21 00:00:00+00:00 [scheduled]>
           <TaskInstance: example_bash_operator.runme_1 2020-10-21 00:00:00+00:00 [scheduled]>
           <TaskInstance: example_bash_operator.runme_2 2020-10-21 00:00:00+00:00 [scheduled]>
           <TaskInstance: example_bash_operator.also_run_this 2020-10-21 00:00:00+00:00 [scheduled]>
   [2020-10-23 17:59:55,176] {scheduler_job.py:1137} INFO - Sending TaskInstanceKey(dag_id='example_bash_operator', task_id='runme_0', execution_date=datetime.datetime(2020, 10, 21, 0, 0, tzinfo=Timezone('UTC')), try_number=1) to executor with priority 3 and queue default
   [2020-10-23 17:59:55,176] {base_executor.py:78} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_bash_operator', 'runme_0', '2020-10-21T00:00:00+00:00', '--local', '--pool', 'default_pool', '--subdir', '/opt/airflow/airflow/example_dags/example_bash_operator.py']
   [2020-10-23 17:59:55,177] {scheduler_job.py:1137} INFO - Sending TaskInstanceKey(dag_id='example_bash_operator', task_id='runme_1', execution_date=datetime.datetime(2020, 10, 21, 0, 0, tzinfo=Timezone('UTC')), try_number=1) to executor with priority 3 and queue default
   [2020-10-23 17:59:55,177] {base_executor.py:78} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_bash_operator', 'runme_1', '2020-10-21T00:00:00+00:00', '--local', '--pool', 'default_pool', '--subdir', '/opt/airflow/airflow/example_dags/example_bash_operator.py']
   [2020-10-23 17:59:55,178] {scheduler_job.py:1137} INFO - Sending TaskInstanceKey(dag_id='example_bash_operator', task_id='runme_2', execution_date=datetime.datetime(2020, 10, 21, 0, 0, tzinfo=Timezone('UTC')), try_number=1) to executor with priority 3 and queue default
   [2020-10-23 17:59:55,178] {base_executor.py:78} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_bash_operator', 'runme_2', '2020-10-21T00:00:00+00:00', '--local', '--pool', 'default_pool', '--subdir', '/opt/airflow/airflow/example_dags/example_bash_operator.py']
   [2020-10-23 17:59:55,178] {scheduler_job.py:1137} INFO - Sending TaskInstanceKey(dag_id='example_bash_operator', task_id='also_run_this', execution_date=datetime.datetime(2020, 10, 21, 0, 0, tzinfo=Timezone('UTC')), try_number=1) to executor with priority 2 and queue default
   [2020-10-23 17:59:55,179] {base_executor.py:78} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'example_bash_operator', 'also_run_this', '2020-10-21T00:00:00+00:00', '--local', '--pool', 'default_pool', '--subdir', '/opt/airflow/airflow/example_dags/example_bash_operator.py']
   [2020-10-23 17:59:55,179] {sequential_executor.py:57} INFO - Executing command: ['airflow', 'tasks', 'run', 'example_bash_operator', 'runme_0', '2020-10-21T00:00:00+00:00', '--local', '--pool', 'default_pool', '--subdir', '/opt/airflow/airflow/example_dags/example_bash_operator.py']
   [2020-10-23 17:59:59,527] {dagbag.py:436} INFO - Filling up the DagBag from /opt/airflow/airflow/example_dags/example_bash_operator.py
   Running <TaskInstance: example_bash_operator.runme_0 2020-10-21T00:00:00+00:00 [scheduled]> on host e08ca4038769
   ```
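
   The `sequential_executor.py:57` line in the trace shows the SequentialExecutor handling the queued tasks; it runs one task at a time in-process, so the scheduler loop blocks while each task executes, which matches the "stuck" behaviour above. A minimal sketch (using a hypothetical inline config snippet, not a real `$AIRFLOW_HOME/airflow.cfg`) of checking which executor is configured by parsing the config with the stdlib `configparser`:

   ```python
   import configparser
   import os
   import tempfile

   # Hypothetical minimal airflow.cfg contents; in a real deployment this
   # file lives under $AIRFLOW_HOME (e.g. /opt/airflow/airflow.cfg).
   cfg_text = """
   [core]
   executor = SequentialExecutor
   """

   # Write the snippet to a temp file so the example is self-contained.
   with tempfile.NamedTemporaryFile("w", suffix=".cfg", delete=False) as f:
       f.write(cfg_text)
       path = f.name

   parser = configparser.ConfigParser()
   parser.read(path)

   # [core] executor is the setting the scheduler uses to pick its executor.
   executor = parser.get("core", "executor")
   print(executor)  # SequentialExecutor

   os.unlink(path)
   ```

   With `--backend postgres` one would normally expect LocalExecutor rather than SequentialExecutor, so a check like this helps confirm whether the environment picked up the intended executor.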

