Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2021/06/25 08:19:30 UTC

[GitHub] [airflow] fernhtls edited a comment on issue #9722: Airflow can't import DAG in UI and logs, but manual DAG trigger works

fernhtls edited a comment on issue #9722:
URL: https://github.com/apache/airflow/issues/9722#issuecomment-868308697


   I'm seeing a similar behaviour, but when checking the scheduler / parser logs I see the following:
   
   ```
   Traceback (most recent call last):
     File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/dagbag.py", line 317, in _load_modules_from_file
       loader.exec_module(new_module)
     File "<frozen importlib._bootstrap_external>", line 848, in exec_module
     File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
     File "/opt/airflow/dags/repo/<git_sub_path>/workflows/<path>/dag_file.py", line 14, in <module>
       from custom_operators.<operator_something>  import <ClassOperatorSomething>
   ModuleNotFoundError: No module named 'custom_operators'
   ```
   
   **PS: I have redacted a few pieces of the paths and file names.**
   
   **So we have some custom operators on a different path than the DAGs, but still below the DAG bag directory.**
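   Given that layout, one workaround that sometimes helps (a minimal sketch only; the relative location of `custom_operators/` and the names below are assumptions, not our real layout) is to put the package's parent directory on `sys.path` explicitly at the top of the DAG file:
   
    ```
    # Sketch only: the relative location of custom_operators/ is an assumption
    # and has to be adjusted to the real repository layout.
    import os
    import sys
    
    # Directory that contains the custom_operators/ package, resolved relative
    # to this DAG file so it does not depend on the current working directory.
    _PKG_PARENT = os.path.abspath(os.path.join(os.path.dirname(__file__), ".."))
    
    if _PKG_PARENT not in sys.path:
        sys.path.insert(0, _PKG_PARENT)
    
    # With the parent directory on sys.path, the failing import from the
    # traceback above should resolve (names are placeholders):
    # from custom_operators.some_operator import SomeOperator
    ```
   
   That said, this is more of a band-aid than a fix, since every DAG file that uses the package would need the same prelude.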
   
   I have tried setting a PYTHONPATH env var and it didn't help; however, when I open a plain Python prompt manually with that same PYTHONPATH set, the import now works fine.
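   To see what the parsing process actually gets, one thing that could help is a throwaway probe file dropped into the DAGs folder (a diagnostic sketch; the file name is made up). Its log lines should show up in the scheduler / DAG-processor logs and can be compared with the manual Python prompt. Note that with the default DAG discovery safe mode the scheduler only parses files whose contents mention both "airflow" and "dag", which the comments below already satisfy:
   
    ```
    # _syspath_probe.py -- throwaway diagnostic dropped into the Airflow DAGs
    # folder (file name is made up). When the scheduler parses it, it logs the
    # paths the parser actually sees.
    import logging
    import os
    import sys
    
    log = logging.getLogger(__name__)
    
    log.warning("PYTHONPATH env var seen by the parser: %r", os.environ.get("PYTHONPATH"))
    log.warning("sys.path during DAG parsing: %r", sys.path)
    ```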
   
   We are using **DAG Serialization** with the following parameters:
   
   ```
   store_dag_code = True
   min_serialized_dag_update_interval = 30
   min_serialized_dag_fetch_interval = 10
   max_num_rendered_ti_fields_per_task = 30
   ```
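   
   As a quick sanity check that the running components actually picked these values up, something like the following could be run inside the scheduler or webserver container (a small sketch using Airflow's configuration API, assuming the options live in the `[core]` section of airflow.cfg):
   
    ```
    # Sketch: print the serialization-related settings the running Airflow
    # installation resolved, assuming they sit in the [core] section.
    from airflow.configuration import conf
    
    print("store_dag_code:", conf.getboolean("core", "store_dag_code"))
    print("min_serialized_dag_update_interval:", conf.getint("core", "min_serialized_dag_update_interval"))
    print("min_serialized_dag_fetch_interval:", conf.getint("core", "min_serialized_dag_fetch_interval"))
    print("max_num_rendered_ti_fields_per_task:", conf.getint("core", "max_num_rendered_ti_fields_per_task"))
    ```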
   
   We are passing the usual git-sync arguments to the Helm chart, plus setting the path to the submodule with `dags.gitSync.subPath`, and DAG persistence is turned off.
   
   We are on Airflow version 2.1.0, and we are using the apache-airflow Helm chart.
   
   * Would it really be a matter of the PYTHONPATH not being set correctly when the parser runs in our case?
     * The log above keeps showing that DAG parsing cannot import the package, even though `airflow info` lists the whole DAG bag path under `python_path`; yet when I do it manually now, in a Python console with the PYTHONPATH set, I can import the package without any problems (see the sketch just below this list).
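   
   One way to narrow this down could be to mimic what the DagBag does when it loads the file (the traceback above ends in `loader.exec_module`), running it with the same interpreter and environment as the scheduler. This is only a rough sketch; the path and module name are placeholders:
   
    ```
    # Sketch: load a DAG file the same way the traceback above shows the DagBag
    # doing it (spec_from_file_location + exec_module), so the import failure
    # can be reproduced outside the scheduler. The path below is a placeholder.
    import importlib.util
    import sys
    
    DAG_FILE = "/opt/airflow/dags/repo/workflows/dag_file.py"  # placeholder
    
    spec = importlib.util.spec_from_file_location("probe_dag_module", DAG_FILE)
    module = importlib.util.module_from_spec(spec)
    try:
        spec.loader.exec_module(module)
        print("import OK")
    except ModuleNotFoundError as exc:
        print("import failed:", exc)
    print("sys.path was:", sys.path)
    ```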
     

