Posted to commits@airflow.apache.org by "Laura Lorenz (JIRA)" <ji...@apache.org> on 2016/10/11 19:15:21 UTC

[jira] [Commented] (AIRFLOW-432) Failure in importing DAG

    [ https://issues.apache.org/jira/browse/AIRFLOW-432?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15566311#comment-15566311 ] 

Laura Lorenz commented on AIRFLOW-432:
--------------------------------------

Where is your operator file (seems to be {{operators.py}}) in relation to your DAG definition file (seems to be {{dags.py}}), and what does the {{dags_folder}} setting in your {{airflow.cfg}} resolve to?
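
If it helps, a quick way to confirm where the scheduler is actually looking is to print the resolved setting and check that both files sit under it. This is just a minimal sketch, assuming the {{airflow.configuration}} module API from the 1.7-era release shown in your traceback; adjust the file names to your layout:

{code}
from os import path
from airflow import configuration as conf

# Folder the DagBag scans for DAG definition files, resolved from airflow.cfg.
dags_folder = conf.get('core', 'dags_folder')
print(dags_folder)

# Both files usually need to live in (or be importable from) this folder
# for "from dags import ..." to resolve the way you expect.
for name in ('dags.py', 'operators.py'):
    print('%s exists: %s' % (name, path.exists(path.join(dags_folder, name))))
{code}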

> Failure in importing DAG
> ------------------------
>
>                 Key: AIRFLOW-432
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-432
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: DagRun
>            Reporter: Shubham
>
> Hello,
> I have a simple DAG file:
> from airflow import DAG
> #Dag for bulk decline of leads
> provider_bulk_decline_dag = DAG('provider_bulk_decline_dag', default_args=dashboard_args, schedule_interval='30 1,13 * * *')
> #Dag for wake up IVR
> wakeup_ivr_dag = DAG('wakeup_ivr_dag', default_args=dashboard_args, schedule_interval='00 3 * * *')
> #Dag for monetization automatic refund
> monetization_refund_dag = DAG('monetization_refund_dag', default_args=dashboard_args, schedule_interval='10 17 * * *')
> #Dag for edit history
> edit_history_dag = DAG('edit_history_dag', default_args=dashboard_args, schedule_interval='30 18 * * *')
> My operator file looks like this:
> from airflow import DAG
> from dags import dashboard_hourly_dag
> from dags import credit_sms_dag
> from dags import hourly_dag
> from dags import daily_sms_dag
> from dags import edit_history_dag
> from airflow.operators import BashOperator, DummyOperator, PythonOperator, BranchPythonOperator
> editHistoryReportTask = PythonOperator(
>     task_id='edit_history_report',
>     provide_context=False,
>     python_callable=edit_history_report_aggregator,
>     op_kwargs={'env': 'PROD'},
>     dag=edit_history_dag)
> But when I start the webserver, it keeps giving me this error:
> Broken DAG: [/home/ubuntu/airflow/operators.py] cannot import name edit_history_dag
> Meanwhile, the command airflow list_dags does list the DAG above.
> The error in log file is:
> ERROR - Failed to import: /home/ubuntu/airflow/operators.py
> Traceback (most recent call last):
>   File "/usr/local/lib/python2.7/dist-packages/airflow/models.py", line 247, in process_file
>     m = imp.load_source(mod_name, filepath)
>   File "/home/ubuntu/airflow/operators.py", line 6, in <module>
>     from dags import edit_history_dag
> ImportError: cannot import name edit_history_dag
> Please help with this issue, because it recurs whenever I add a new DAG.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)