Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2019/08/05 19:39:58 UTC

[GitHub] [airflow] TV4Fun opened a new pull request #5726: [AIRFLOW-5104] Set default schedule for GCP Transfer operators

URL: https://github.com/apache/airflow/pull/5726
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [x] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
     - https://issues.apache.org/jira/browse/AIRFLOW-5104
     - In case you are fixing a typo in the documentation you can prepend your commit with \[AIRFLOW-XXX\]; code changes always need a Jira issue.
     - In case you are proposing a fundamental code change, you need to create an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
     - In case you are adding a dependency, check if the license complies with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [x] Here are some details about my PR, including screenshots of any UI changes:
   
   The GCS Transfer Service REST API requires that a schedule be set, even for
   one-time immediate runs. This adds code to
   `S3ToGoogleCloudStorageTransferOperator` and
   `GoogleCloudStorageToGoogleCloudStorageTransferOperator` to set a default
   one-time immediate run schedule when no `schedule` argument is passed.
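   The change can be sketched roughly as follows. This is a simplified illustration, not the PR's exact diff: the helper name `_set_default_schedule` is hypothetical, and it assumes the operator builds a transfer-job `dict` (`body`) whose `schedule` field follows the Storage Transfer Service REST API's `TransferJob.Schedule` shape, where setting `scheduleStartDate` equal to `scheduleEndDate` yields a single immediate run:

   ```python
   from datetime import date


   def _set_default_schedule(body):
       """If the caller supplied no schedule, add a one-time schedule for today.

       `body` is assumed to be the transfer-job dict sent to the Storage
       Transfer Service API; the field names mirror the REST API's
       TransferJob.Schedule message.
       """
       if "schedule" not in body:
           today = date.today()
           one_day = {"year": today.year, "month": today.month, "day": today.day}
           # start date == end date => the job runs exactly once, immediately
           body["schedule"] = {
               "scheduleStartDate": one_day,
               "scheduleEndDate": one_day,
           }
       return body
   ```

   With a default like this, omitting the `schedule` argument no longer causes the API to reject the request for a missing required field.
   
   
   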
   
   ### Tests
   
   - [x] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason:
   This fixes existing behavior and is hard to unit test, since the error only appears when the request is actually sent to the GCP API. I have tested it by running it on a Cloud Composer cluster and using `S3ToGoogleCloudStorageTransferOperator` to transfer files from S3 to GCS. I have not tested `GoogleCloudStorageToGoogleCloudStorageTransferOperator` in the same way, so someone should verify it, as I suspect it was never exercised against the real API before release. Combined with the fix I am about to submit for AIRFLOW-5114, this allows `S3ToGoogleCloudStorageTransferOperator` to run correctly with the default scheduling and timeout arguments.
   
   ### Commits
   
   - [x] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)":
     1. Subject is separated from body by a blank line
     1. Subject is limited to 50 characters (not including Jira issue reference)
     1. Subject does not end with a period
     1. Subject uses the imperative mood ("add", not "adding")
     1. Body wraps at 72 characters
     1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [x] In case of new functionality, my PR adds documentation that describes how to use it.
     - All the public functions and classes in the PR contain docstrings that explain what they do
     - If you implement backwards incompatible changes, please leave a note in [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so we can assign it to an appropriate release
   
   ### Code Quality
   
   - [x] Passes `flake8`
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services