Posted to dev@airflow.apache.org by Kerr Shireman <ha...@gmail.com> on 2017/01/24 20:58:49 UTC

Airflow 1.7.1.3 on Mac OS X Sierra - scheduler not triggering DAGs

Hi all,

My shop is using Airflow 1.7.1.3 for job scheduling.  For our developers,
I'm installing Airflow on MacBook Pros running Mac OS X Sierra (10.12.1).

Some details:

   - Using stock Python interpreter 2.7.10
   - Airflow is installed in a clean virtualenv using the basic tutorial
   <https://airflow.incubator.apache.org/installation.html> (sketch below)
   - Install succeeds without issue
   - Webserver runs without issue
   - Externally triggered DAGs run without issue
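
For reference, the install is just the tutorial steps; roughly as follows
(the venv path is my own choice, and the PyPI package name for this
release is "airflow"):

virtualenv ~/airflow-venv            # against the stock Python 2.7.10
source ~/airflow-venv/bin/activate
export AIRFLOW_HOME=~/airflow        # the default location
pip install airflow==1.7.1.3
airflow initdb                       # creates airflow.cfg and the SQLite db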

Scheduler issue:

   - The Airflow scheduler finds jobs, but they appear to be neither queued
   nor executed.
   - Test DAG: example_branch_dop_operator_v3 from the Airflow-provided examples
   - The scheduler log reads as follows, with the scheduler loop completing
   without ever checking the state of a DagRun:

[2017-01-24 14:28:38,056] {jobs.py:247} ERROR - Cannot use more than 1
thread when using sqlite. Setting max_threads to 1
[2017-01-24 14:28:38,065] {jobs.py:680} INFO - Starting the scheduler
[2017-01-24 14:28:38,066] {models.py:154} INFO - Filling up the DagBag from
/Users/username/airflow/dags
[2017-01-24 14:28:38,153] {jobs.py:574} INFO - Prioritizing 0 queued jobs
[2017-01-24 14:28:53,076] {jobs.py:726} INFO - Starting 1 scheduler jobs
[2017-01-24 14:28:53,084] {jobs.py:741} INFO - Done queuing tasks, calling
the executor's heartbeat
[2017-01-24 14:28:53,084] {jobs.py:744} INFO - Loop took: 0.019503 seconds
[2017-01-24 14:28:53,088] {models.py:305} INFO - Finding 'running' jobs
without a recent heartbeat
[2017-01-24 14:28:53,089] {models.py:311} INFO - Failing jobs without
heartbeat after 2017-01-24 14:26:38.089514
[2017-01-24 14:28:58,075] {jobs.py:574} INFO - Prioritizing 0 queued jobs
[2017-01-24 14:28:58,083] {jobs.py:726} INFO - Starting 1 scheduler jobs
[2017-01-24 14:28:58,092] {jobs.py:741} INFO - Done queuing tasks, calling
the executor's heartbeat
[2017-01-24 14:28:58,092] {jobs.py:744} INFO - Loop took: 0.020154 seconds
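
For completeness, the relevant airflow.cfg settings are the tutorial
defaults, which is also what produces the max_threads ERROR above. A quick
way to confirm (key names per 1.7.x; the values shown are my recollection
of the defaults):

grep -E '^(executor|sql_alchemy_conn|max_threads)' ~/airflow/airflow.cfg
# executor = SequentialExecutor
# sql_alchemy_conn = sqlite:////Users/username/airflow/airflow.db
# max_threads = 2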

Scheduler runs on older OS X build:

   - This issue does not manifest on OS X El Capitan (10.11.1)

My own machine is a mid-2015 MacBook running OS X El Capitan (10.11.1), and
my scheduler log shows the same job being queued and executed.  I've
*italicized* the lines missing from the log above:

[2017-01-24 14:39:55,843] {jobs.py:247} ERROR - Cannot use more than 1
thread when using sqlite. Setting max_threads to 1
[2017-01-24 14:39:55,853] {jobs.py:680} INFO - Starting the scheduler
[2017-01-24 14:39:55,853] {models.py:154} INFO - Filling up the DagBag from
/Users/username/airflow/dags
[2017-01-24 14:39:55,944] {jobs.py:574} INFO - Prioritizing 0 queued jobs
[2017-01-24 14:39:55,963] {jobs.py:726} INFO - Starting 1 scheduler jobs
*[2017-01-24 14:39:55,998] {models.py:2660} INFO - Checking state for
<DagRun example_branch_dop_operator_v3 @ 2017-01-21 03:39:00:
scheduled__2017-01-21T03:39:00, externally triggered: False>*
*[2017-01-24 14:39:55,999] {models.py:2660} INFO - Checking state for
<DagRun example_branch_dop_operator_v3 @ 2017-01-21 03:40:00:
scheduled__2017-01-21T03:40:00, externally triggered: False>*
*[2017-01-24 14:39:55,999] {models.py:2660} INFO - Checking state for
<DagRun example_branch_dop_operator_v3 @ 2017-01-21 03:41:00:
scheduled__2017-01-21T03:41:00, externally triggered: False>*
*[2017-01-24 14:39:55,999] {jobs.py:498} INFO - Getting list of tasks to
skip for active runs.*
*[2017-01-24 14:39:56,002] {jobs.py:514} INFO - Checking dependencies on 9
tasks instances, minus 2 skippable ones*
*[2017-01-24 14:39:56,015] {base_executor.py:36} INFO - Adding to queue:
airflow run example_branch_dop_operator_v3 oper_1 2017-01-21T03:40:00
--local -sd
DAGS_FOLDER/example_dags/example_branch_python_dop_operator_3.py*
*[2017-01-24 14:39:56,024] {base_executor.py:36} INFO - Adding to queue:
airflow run example_branch_dop_operator_v3 condition 2017-01-21T03:41:00
--local -sd
DAGS_FOLDER/example_dags/example_branch_python_dop_operator_3.py*
*[2017-01-24 14:39:56,033] {base_executor.py:36} INFO - Adding to queue:
airflow run example_branch_dop_operator_v3 oper_2 2017-01-21T03:39:00
--local -sd
DAGS_FOLDER/example_dags/example_branch_python_dop_operator_3.py*
[2017-01-24 14:39:56,054] {jobs.py:741} INFO - Done queuing tasks, calling
the executor's heartbeat
[2017-01-24 14:39:56,054] {jobs.py:744} INFO - Loop took: 0.112856 seconds
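
As a side note, a single task from that DAG can be exercised outside the
scheduler with airflow test, which skips scheduling and dependency checks
entirely, e.g.:

airflow test example_branch_dop_operator_v3 oper_1 2017-01-21T03:40:00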

We turned DEBUG logging on, but I didn't see anything that helped beyond
confirming that models.py never checks the state of a DagRun, and
consequently base_executor.py never triggers the DAG run.
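
For anyone reproducing this, the metadata db can also be inspected directly
to see whether DagRun rows are being created at all (the db path below is
the tutorial default; adjust if your sql_alchemy_conn differs):

sqlite3 ~/airflow/airflow.db \
  "SELECT dag_id, execution_date, state FROM dag_run
   WHERE dag_id = 'example_branch_dop_operator_v3'
   ORDER BY execution_date DESC LIMIT 5;"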

Questions:

   - Is this a known issue addressed in 1.8?
      - Note that we tested 1.8b2 with all other variables being the same,
      and this issue persists.
   - Is there something else we can do to shed light on the issue?


Thanks for your help!
Kerr Shireman - Data Scientist - AutoAlert

Re: Airflow 1.7.1.3 on Mac OS X Sierra - scheduler not triggering DAGs

Posted by Jayesh Senjaliya <jh...@gmail.com>.
Hi Kerr and Bolke,

I am hitting the "scheduler not scheduling tasks" problem too, but I am on
1.8b2, and I have debugged it to find the root cause.

Let me send a separate email, though, describing the issue I am facing.

Thanks
Jayesh




Re: Airflow 1.7.1.3 on Mac OS X Sierra - scheduler not triggering DAGs

Posted by Kerr Shireman <ha...@gmail.com>.
Bolke,

Thanks for the quick reply.  I wonder if anyone else on Sierra is running a
local dev instance against SQLite?

Thanks,
Kerr


Re: Airflow 1.7.1.3 on Mac OS X Sierra - scheduler not triggering DAGs

Posted by Bolke de Bruin <bd...@gmail.com>.
I run Sierra and do my dev mostly on it. I run in a virtualenv with a local MariaDB and don't see this behavior.
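
For reference, pointing Airflow at MariaDB instead of SQLite is a one-line
change in airflow.cfg plus re-running initdb; something like the following
(the connection string is an example, not my exact setup, and it needs the
mysql extra installed, e.g. pip install airflow[mysql]):

# in $AIRFLOW_HOME/airflow.cfg:
#   executor = LocalExecutor
#   sql_alchemy_conn = mysql://airflow:airflow@localhost:3306/airflow
airflow initdb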

Sent from my iPhone
