Posted to commits@airflow.apache.org by "ASF GitHub Bot (JIRA)" <ji...@apache.org> on 2019/06/10 16:56:00 UTC

[jira] [Commented] (AIRFLOW-4760) Packaged DAGs disappear from the DagBag

    [ https://issues.apache.org/jira/browse/AIRFLOW-4760?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16860147#comment-16860147 ] 

ASF GitHub Bot commented on AIRFLOW-4760:
-----------------------------------------

danmactough commented on pull request #5404: [AIRFLOW-4760] Fix package DAGs disappearing from DagBag when reloaded
URL: https://github.com/apache/airflow/pull/5404
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
     - https://issues.apache.org/jira/browse/AIRFLOW-4760
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI changes:
   
   [AIRFLOW-2900](https://github.com/apache/airflow/pull/3749) introduced a change that breaks Airflow whenever it tries to reprocess a packaged DAG. Since that change, the `fileloc` column in the database for packaged DAGs is no longer an actual filepath. It looks something like `/usr/local/airflow/dags/package.zip/my_dag.py` -- **notice that the path to the DAG inside the zip is appended to the zip file's path**. When the `DagBag#process_file` method later tries to load that filepath, it bails out because no such file exists on disk.
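   
   To make the failure mode concrete, here is a minimal sketch (the path below is just an example, not from any real deployment) of the kind of "is this a real file?" guard that stops processing:
   
   ```python
   import os
   
   # The fileloc recorded in the database for a packaged DAG (example path only):
   fileloc = "/usr/local/airflow/dags/package.zip/my_dag.py"
   
   # DagBag#process_file checks that the path it receives points at a real file
   # before doing anything else; this combined "archive + module inside the
   # archive" path never does, so processing stops and the DAG drops out.
   if not os.path.isfile(fileloc):
       print("process_file would bail out here")
   ```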
   
   This PR fixes that behavior by updating the `DagBag#get_dag` method: when it fetches the DAG from the database and goes to reprocess the source file, instead of passing `orm_dag.fileloc` to `process_file` directly, it runs the value through the existing `correct_maybe_zipped` helper so that `process_file` always receives a path that actually exists on disk.
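   
   For reference, this is roughly what `correct_maybe_zipped` gives us (the import path below is the 1.10.x location; in later releases the helper moved to `airflow.utils.file`):
   
   ```python
   from airflow.utils.dag_processing import correct_maybe_zipped
   
   # When the archive actually exists on disk, the helper maps the combined path
   # back to the zip itself, which is a path process_file can load:
   correct_maybe_zipped("/usr/local/airflow/dags/package.zip/my_dag.py")
   # -> "/usr/local/airflow/dags/package.zip"  (given package.zip is a real zip file)
   
   # Ordinary .py paths are returned unchanged:
   correct_maybe_zipped("/usr/local/airflow/dags/plain_dag.py")
   # -> "/usr/local/airflow/dags/plain_dag.py"
   ```
   
   Inside `DagBag#get_dag`, that return value is what gets handed to `process_file` instead of the raw `orm_dag.fileloc`.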
   
   #### Before
   
   ![2019-06-07-ghosting-before](https://user-images.githubusercontent.com/357481/59211651-546bc580-8b7e-11e9-9a6a-fa7ee56e65de.gif)
   
   #### After
   
   ![2019-06-07-ghosting-after](https://user-images.githubusercontent.com/357481/59211737-9f85d880-8b7e-11e9-877a-98c1f729d186.gif)
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason:
   
     - add a sanity check test to ensure that the `fileloc` property does not change for packaged DAGs
     - add a test to verify that packaged DAGs can be refreshed (a rough sketch of these checks follows this list)
     - add a test that plain `.py` DAGs can also be refreshed (seems silly to have test coverage for only packaged DAGs 😄)
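   
   A rough, self-contained sketch of the packaged-DAG checks (fixture names, dag ids, and paths below are illustrative, not the ones actually added in this PR; it assumes a working Airflow test environment with an initialized metadata DB):
   
   ```python
   import os
   
   from airflow.models import DagBag
   
   DAGS_DIR = os.path.join(os.path.dirname(__file__), "dags")
   
   
   def test_packaged_dag_fileloc_keeps_zip_and_module_path():
       # Load a folder containing a packaged DAG, e.g. dags/test_zip.zip with a
       # module test_zip_dag.py that defines dag_id "test_zip_dag".
       dagbag = DagBag(dag_folder=DAGS_DIR, include_examples=False)
       dag = dagbag.get_dag("test_zip_dag")
       assert dag is not None
       # fileloc should keep both the archive path and the module path inside it
       assert dag.fileloc.endswith(os.path.join("test_zip.zip", "test_zip_dag.py"))
   
   
   def test_packaged_dag_survives_a_refresh():
       dagbag = DagBag(dag_folder=DAGS_DIR, include_examples=False)
       # Asking for the DAG again exercises DagBag#get_dag, which may re-read the
       # source from orm_dag.fileloc -- the code path this PR fixes.
       assert dagbag.get_dag("test_zip_dag") is not None
       assert dagbag.get_dag("test_zip_dag") is not None
   ```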
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)":
     1. Subject is separated from body by a blank line
     1. Subject is limited to 50 characters (not including Jira issue reference)
     1. Subject does not end with a period
     1. Subject uses the imperative mood ("add", not "adding")
     1. Body wraps at 72 characters
     1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes how to use it.
     - All the public functions and the classes in the PR contain docstrings that explain what they do
     - If you implement backwards-incompatible changes, please leave a note in the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so we can assign it to an appropriate release
   
   ### Code Quality
   
   - [ ] Passes `flake8`
   
 
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


> Packaged DAGs disappear from the DagBag
> ---------------------------------------
>
>                 Key: AIRFLOW-4760
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-4760
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: models
>    Affects Versions: 1.10.1, 1.10.2, 1.10.3, 1.10.4, 2.0.0
>            Reporter: Dan MacTough
>            Assignee: Dan MacTough
>            Priority: Major
>         Attachments: 2019-06-07-ghosting-before.gif
>
>
> AIRFLOW-2900 introduced a change that breaks Airflow whenever it tries to reprocess a packaged DAG. Since that change, the {{fileloc}} column in the database for packaged DAGs is no longer an actual filepath. It looks something like {{/usr/local/airflow/dags/package.zip/my_dag.py}} -- *notice that the path to the DAG inside the zip is appended to the zip file's path*. When the {{DagBag#process_file}} method later tries to load that filepath, it bails out because no such file exists on disk.
>  
> Steps to reproduce:
>  
>  # Add a packaged DAG to your DAGS_FOLDER
>  # From the main /admin UI, click on the DAG to open its detail view
>  # Click the "Refresh" button
>  
> What should happen:
> The detail view should refresh and display a "flash" message saying that the DAG is "fresh as a daisy."
>  
> What actually happens:
> The app redirects to the main /admin UI and displays a "flash" error message in addition to the "fresh as a daisy" message.
>  
> See animated GIF, attached.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)