Posted to commits@airflow.apache.org by "ASF GitHub Bot (JIRA)" <ji...@apache.org> on 2019/08/05 20:16:00 UTC

[jira] [Commented] (AIRFLOW-5114) TypeError when running S3ToGoogleCloudStorageTransferOperator or GoogleCloudStorageToGoogleCloudStorageTransferOperator with default arguments

    [ https://issues.apache.org/jira/browse/AIRFLOW-5114?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16900387#comment-16900387 ] 

ASF GitHub Bot commented on AIRFLOW-5114:
-----------------------------------------

TV4Fun commented on pull request #5727: [AIRFLOW-5114] Fix gcp_transfer_hook behavior with default operator arguments
URL: https://github.com/apache/airflow/pull/5727
 
 
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ X ] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
     - https://issues.apache.org/jira/browse/AIRFLOW-5114
     - In case you are fixing a typo in the documentation you can prepend your commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
     - In case you are proposing a fundamental code change, you need to create an Airflow Improvement Proposal ([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals)).
     - In case you are adding a dependency, check if the license complies with the [ASF 3rd Party License Policy](https://www.apache.org/legal/resolved.html#category-x).
   
   ### Description
   
   - [ X ] Here are some details about my PR, including screenshots of any UI changes:
   
   `GCPTransferServiceHook.wait_for_transfer_job` defaults its `timeout`
   parameter to 60 and assumes it is an integer, or at least comparable
   to one. This is a problem because some of the built-in operators that
   use it, such as `S3ToGoogleCloudStorageTransferOperator` and
   `GoogleCloudStorageToGoogleCloudStorageTransferOperator`, default
   their `timeout` param to `None`, so calling the method with their
   default value raises an error. Fix this by allowing
   `wait_for_transfer_job` to accept a timeout of `None` and fill in an
   appropriate default. This also adds functionality to accept a
   `timedelta` instead of an integer, allows seconds to be any real
   number (there is no need for them to be an integer), and makes the
   time accounting used to determine the timeout more accurate.
   
   ### Tests
   
   - [ X ] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason:
   From the point of view of `GCPTransferServiceHook`, this is a purely
   internal change that doesn't affect any default behavior except
   allowing a parameter to be `None`. Since `None` is now the default,
   that particular case is already covered by the existing unit tests.
   
   In principle this could be tested from
   `S3ToGoogleCloudStorageTransferOperator` or
   `GoogleCloudStorageToGoogleCloudStorageTransferOperator`, but that
   would require giving them an actual `GCPTransferServiceHook` instead
   of a Mock. I have tested this by running it on a Cloud Composer
   cluster and using it to transfer files from S3 to GCS with
   `S3ToGoogleCloudStorageTransferOperator`. I have not tested
   `GoogleCloudStorageToGoogleCloudStorageTransferOperator` the same
   way, so someone should do that, as I don't think anyone actually
   tested it before it was released.
   
   Combined with #5726, this allows
   `S3ToGoogleCloudStorageTransferOperator` to run correctly with
   default arguments for scheduling and timeout.
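   For context, the regression being guarded against is Python 3's
   refusal to order `None` against a number. A minimal standalone
   illustration of the failure and the guard pattern (not the hook's
   actual code):

```python
# Python 3 raises TypeError when comparing None with an int -- exactly
# the failure the operators hit when passing their default timeout of
# None down to the hook.
try:
    None > 0
except TypeError as exc:
    print(exc)


def timeout_remaining(timeout):
    # Guard pattern from the fix: substitute a default before comparing.
    timeout = 60 if timeout is None else timeout
    return timeout > 0
```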
   
   ### Commits
   
   - [ X ] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)":
     1. Subject is separated from body by a blank line
     1. Subject is limited to 50 characters (not including Jira issue reference)
     1. Subject does not end with a period
     1. Subject uses the imperative mood ("add", not "adding")
     1. Body wraps at 72 characters
     1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ X ] In case of new functionality, my PR adds documentation that describes how to use it.
     - All the public functions and the classes in the PR contain docstrings that explain what it does
     - If you implement backwards incompatible changes, please leave a note in the [Updating.md](https://github.com/apache/airflow/blob/master/UPDATING.md) so we can assign it to an appropriate release
   I have updated the docstrings of `GCPTransferServiceHook`, `S3ToGoogleCloudStorageTransferOperator` and
   `GoogleCloudStorageToGoogleCloudStorageTransferOperator` to document their previously undocumented default behaviors and to reflect the relaxed type requirements.
   
   ### Code Quality
   
   - [ X ] Passes `flake8`
   
 
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


> TypeError when running S3ToGoogleCloudStorageTransferOperator or GoogleCloudStorageToGoogleCloudStorageTransferOperator with default arguments
> ----------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: AIRFLOW-5114
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-5114
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: contrib, gcp
>    Affects Versions: 1.10.3
>            Reporter: Joel Croteau
>            Assignee: Joel Croteau
>            Priority: Major
>
> When running `S3ToGoogleCloudStorageTransferOperator` or `GoogleCloudStorageToGoogleCloudStorageTransferOperator` with default arguments, you get the following `TypeError`:
>  
> {noformat}
> [2019-08-05 04:13:19,873] {models.py:1796} ERROR - '>' not supported between instances of 'NoneType' and 'int'
> Traceback (most recent call last):
>   File "/usr/local/lib/airflow/airflow/models.py", line 1664, in _run_raw_task
>     result = task_copy.execute(context=context)
>   File "/home/airflow/gcs/dags/dependencies/gcp_transfer_operator.py", line 675, in execute
>     hook.wait_for_transfer_job(job, timeout=self.timeout)
>   File "/home/airflow/gcs/dags/dependencies/gcp_api_base_hook.py", line 188, in wrapper_decorator
>     return func(self, *args, **kwargs)
>   File "/home/airflow/gcs/dags/dependencies/gcp_transfer_hook.py", line 390, in wait_for_transfer_job
>     while timeout > 0:
> TypeError: '>' not supported between instances of 'NoneType' and 'int'
> {noformat}
> This is because both operators default `timeout` to `None`, and `wait_for_transfer_job` assumes `timeout` is an integer. I have a fix I can submit.
>  



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)