Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2018/09/04 13:28:11 UTC

[GitHub] wmorris75 opened a new pull request #3842: [AIRFLOW-2993] s3_to_sftp and sftp_to_s3 operators (#3828)

URL: https://github.com/apache/incubator-airflow/pull/3842
 
 
    Add 8fit to list of companies
   
   [AIRFLOW-XXX] Add THE ICONIC to the list of orgs using Airflow
   
   Closes #3807 from ksaagariconic/patch-2
   
   [AIRFLOW-2933] Enable Codecov on Docker-CI Build (#3780)
   
    - Add missing variables and use codecov instead of coveralls.
      It wasn't working because of missing environment variables;
      the codecov library relies heavily on environment variables set
      in the CI to determine how to push the reports to codecov.

    - Remove the explicit passing of the variables in the `tox.ini`,
      since it is already done in the `docker-compose.yml`;
      maintaining this in two places makes it brittle.

    - Removed the empty codecov yml, since codecov was complaining
      that it was unable to parse it.
   
   [AIRFLOW-2960] Pin boto3 to <1.8 (#3810)
   
    Boto3 1.8 was released a few days ago and it breaks our tests.
   
    [AIRFLOW-2957] Remove obsolete sensor references
   
   [AIRFLOW-2959] Refine HTTPSensor doc (#3809)
   
    An HTTP error code other than 404, or a
    connection-refused error, fails the sensor
    itself directly (no more poking).
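
    For illustration, a minimal sketch of the behavior described above
    (import path assumes a 1.10-era Airflow; the connection and endpoint
    are hypothetical):

        from airflow.sensors.http_sensor import HttpSensor

        # A 404 keeps the sensor poking; any other HTTP error code (or a
        # connection-refused error) fails the sensor immediately.
        wait_for_endpoint = HttpSensor(
            task_id='wait_for_endpoint',
            http_conn_id='http_default',   # connection to the target service
            endpoint='health',             # path poked on every interval
            poke_interval=60,              # seconds between pokes
            dag=dag,                       # assumes a `dag` defined elsewhere
        )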
   
   [AIRFLOW-2961] Refactor tests.BackfillJobTest.test_backfill_examples test (#3811)
   
    Simplify this test, since it takes up 15% of the total test time. This is
    because every example dag, with some exclusions, is backfilled, which
    puts pressure on the scheduler and everything around it. Covering just
    a couple of dags should be sufficient.
   
   254 seconds:
   [success] 15.03% tests.BackfillJobTest.test_backfill_examples: 254.9323s
   
   [AIRFLOW-XXX] Remove residual line in Changelog (#3814)
   
    [AIRFLOW-2930] Fix celery executor scheduler crash (#3784)
   
    Caused by an update in PR #3740:
    execute_command.apply_async(args=command, ...)
    - command is a list of short unicode strings, and the code above passes
      multiple arguments to a function defined as taking only one argument.
    - command = ["airflow", "run", "dag323", ...]
    - args = command = ["airflow", "run", "dag323", ...]
    - execute_command("airflow", "run", "dag323", ...) raises an error and exits.
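
    Celery's apply_async(args=...) unpacks args into positional arguments,
    just like fn(*args). A minimal sketch of the bug and the fix (the
    command is illustrative; no broker is needed for the sketch):

        def execute_command(command):   # defined as taking a single argument
            print(command)

        command = ["airflow", "run", "dag323"]
        # Buggy (what args=command did): execute_command(*command)
        #   -> TypeError: takes 1 positional argument but 3 were given
        # Fixed (args=[command]): the list arrives as a single argument.
        execute_command(*[command])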
   
   [AIRFLOW-2916] Arg `verify` for AwsHook() & S3 sensors/operators (#3764)
   
   This is useful when
   1. users want to use a different CA cert bundle than the
     one used by botocore.
    2. users want the equivalent of '--no-verify-ssl'. This is especially
      useful when we're using on-premises S3 or other implementations of
      object storage, like IBM's Cloud Object Storage.
   
   The default value here is `None`, which is also the default
   value in boto3, so that backward compatibility is ensured too.
   
   Reference:
   https://boto3.readthedocs.io/en/latest/reference/core/session.html
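
    A hedged sketch of the new argument (the CA path and connection id are
    hypothetical):

        from airflow.hooks.S3_hook import S3Hook

        # Use a custom CA cert bundle instead of botocore's default...
        hook = S3Hook(aws_conn_id='aws_default',
                      verify='/path/to/ca-bundle.pem')

        # ...or disable SSL verification, e.g. for on-premises S3 or
        # IBM's Cloud Object Storage.
        hook_no_ssl = S3Hook(aws_conn_id='aws_default', verify=False)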
   
   [AIRFLOW-2709] Improve error handling in Databricks hook (#3570)
   
   * Use float for default value
   * Use status code to determine whether an error is retryable
   * Fix wrong type in assertion
   * Fix style to prevent lines from exceeding 90 characters
   * Fix wrong way of checking exception type
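
    A sketch of the status-code-based retry check (an assumed helper, not
    necessarily the hook's exact code): connection errors and server-side
    (5xx) responses are treated as transient, while 4xx fail immediately.

        import requests

        def _retryable_error(exception):
            # Retry on network-level failures and server-side (5xx) errors.
            return (isinstance(exception, (requests.exceptions.ConnectionError,
                                           requests.exceptions.Timeout))
                    or (exception.response is not None
                        and exception.response.status_code >= 500))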
   
   [AIRFLOW-2854] kubernetes_pod_operator add more configuration items (#3697)
   
   * kubernetes_pod_operator add more configuration items
   * fix test_kubernetes_pod_operator test_faulty_service_account failure case
   * fix review comment issues
   * pod_operator add hostnetwork config
   * add doc example
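
    A hedged sketch of the new hostnetwork configuration item (image and
    namespace are illustrative; the other new items are omitted):

        from airflow.contrib.operators.kubernetes_pod_operator import (
            KubernetesPodOperator,
        )

        pod = KubernetesPodOperator(
            task_id='example_pod',
            name='example-pod',
            namespace='default',
            image='python:3.6',
            cmds=['python', '-c', 'print("hello")'],
            hostnetwork=True,   # newly added: run the pod on the host network
            dag=dag,            # assumes a `dag` defined elsewhere
        )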
   
   [AIRFLOW-2994] Fix command status check in Qubole Check operator (#3790)
   
   [AIRFLOW-2928] Use uuid4 instead of uuid1 (#3779)
   
   for better randomness.
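
    For reference, the difference in two lines:

        import uuid

        print(uuid.uuid1())  # derived from host MAC address and timestamp
        print(uuid.uuid4())  # fully random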
   
    [AIRFLOW-2993] Added s3_to_sftp and sftp_to_s3 operators (#3828)
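
    A hedged sketch of the new transfer operators (import path and argument
    names follow the PR and may differ after the renames mentioned below):

        from airflow.contrib.operators.s3_to_sftp_operator import S3ToSFTPOperator

        s3_to_sftp = S3ToSFTPOperator(
            task_id='s3_to_sftp',
            s3_bucket='my-bucket',          # hypothetical source bucket
            s3_key='data/file.csv',         # object to download
            sftp_path='/upload/file.csv',   # destination on the SFTP server
            sftp_conn_id='ssh_default',
            s3_conn_id='aws_default',
            dag=dag,
        )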
   
   [AIRFLOW-2949] Add syntax highlight for single quote strings (#3795)
   
   * AIRFLOW-2949: Add syntax highlight for single quote strings
   
   * AIRFLOW-2949: Also updated new UI main.css
   
   [AIRFLOW-2948] Arg check & better doc - SSHOperator & SFTPOperator (#3793)
   
    There may be different combinations of arguments, and
    some processing is done 'silently', while users
    may not be fully aware of it.

    For example:
    - Users only need to provide either `ssh_hook`
      or `ssh_conn_id`, while this is not clear in the doc.
    - If both are provided, `ssh_conn_id` will be ignored.
    - If `remote_host` is provided, it will replace
      the `remote_host` which was defined in `ssh_hook`
      or predefined in the connection of `ssh_conn_id`.

    These should be documented clearly to ensure it's
    transparent to users. log.info() should also be
    used to remind users and provide clear logs.
   
   In addition, add instance check for ssh_hook to ensure
   it is of the correct type (SSHHook).
   
   Tests are updated for this PR.
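
    A minimal sketch of the precedence described above (connection id and
    host are illustrative):

        from airflow.contrib.operators.ssh_operator import SSHOperator

        # Either ssh_hook or ssh_conn_id is enough; if both are given,
        # ssh_conn_id is ignored in favour of the hook.
        run_remote = SSHOperator(
            task_id='run_remote',
            ssh_conn_id='ssh_default',
            remote_host='10.0.0.5',   # replaces the host from the connection
            command='echo hello',
            dag=dag,
        )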
   
   [AIRFLOW-XXX] Fix Broken Link in CONTRIBUTING.md
   
   [AIRFLOW-2980] ReadTheDocs - Fix Missing API Reference
   
   [AIRFLOW-2984] Convert operator dates to UTC (#3822)
   
    Tasks can have start_dates or end_dates separately
    from the DAG. These need to be converted to UTC, otherwise
    we cannot use them for calculating the next execution
    date.
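
    A sketch of the conversion (not the exact Airflow internals; the
    timezone is illustrative):

        from datetime import datetime

        import pendulum

        # A start_date in a local timezone must be normalized to UTC before
        # it can be compared against the schedule.
        local_tz = pendulum.timezone('Europe/Amsterdam')
        start_date = pendulum.instance(datetime(2018, 9, 1, 8, 0), tz=local_tz)
        print(start_date.in_timezone('UTC'))  # 2018-09-01 06:00:00+00:00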
   
   [AIRFLOW-2779] Make GHE auth third party licensed (#3803)
   
   This reinstates the original license.
   
   [AIRFLOW-XXX] Add Format to list of companies (#3824)
   
   [AIRFLOW-2900] Show code for packaged DAGs (#3749)
   
   [AIRFLOW-2983] Add prev_ds_nodash and next_ds_nodash macro (#3821)
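
    For example, in a templated field (assumes a `dag` defined elsewhere):

        from airflow.operators.bash_operator import BashOperator

        echo_dates = BashOperator(
            task_id='echo_dates',
            bash_command='echo "prev={{ prev_ds_nodash }} next={{ next_ds_nodash }}"',
            dag=dag,  # renders e.g. prev=20180903 next=20180905 on 2018-09-04
        )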
   
   [AIRFLOW-2989] Add param to set bootDiskType in Dataproc Op (#3825)
   
   Add param to set bootDiskType for master and
   worker nodes in `DataprocClusterCreateOperator`
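
    A hedged sketch (parameter names are inferred from the PR description
    and may differ in the merged version; project and cluster are
    hypothetical):

        from airflow.contrib.operators.dataproc_operator import (
            DataprocClusterCreateOperator,
        )

        create_cluster = DataprocClusterCreateOperator(
            task_id='create_cluster',
            cluster_name='example-cluster',
            project_id='my-project',
            num_workers=2,
            master_disk_type='pd-ssd',       # boot disk type, master node
            worker_disk_type='pd-standard',  # boot disk type, worker nodes
            dag=dag,
        )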
   
   [AIRFLOW-2974] Extended Databricks hook with clusters operation (#3817)
   
    Add hooks for cluster start, restart, and terminate.
    Add unit tests for the added hooks.
    Add a cluster_id variable for performing cluster operation tests.
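
    A hedged sketch of the new hook methods (method names mirror the
    commit message and the Databricks clusters API; the cluster id is
    illustrative):

        from airflow.contrib.hooks.databricks_hook import DatabricksHook

        hook = DatabricksHook(databricks_conn_id='databricks_default')
        hook.start_cluster({'cluster_id': '0904-123456-abcd123'})
        hook.restart_cluster({'cluster_id': '0904-123456-abcd123'})
        hook.terminate_cluster({'cluster_id': '0904-123456-abcd123'})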
   
   [AIRFLOW-2993] Fix Docstrings for Operators (#3828)
   
   Add 'steps' into template_fields in EmrAddSteps
   
    Rendering templates which are in steps is especially useful if you
    want to pass execution time as one of the parameters of a step in
    an EMR cluster. All fields in template_fields will get rendered.
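
    For example (sketch; the cluster id and step definition are
    illustrative):

        from airflow.contrib.operators.emr_add_steps_operator import (
            EmrAddStepsOperator,
        )

        # With 'steps' in template_fields, the Jinja expressions inside the
        # step definition are rendered with the execution context.
        add_step = EmrAddStepsOperator(
            task_id='add_step',
            job_flow_id='j-XXXXXXXXXXXXX',   # hypothetical EMR cluster id
            steps=[{
                'Name': 'process_{{ ds_nodash }}',
                'ActionOnFailure': 'CONTINUE',
                'HadoopJarStep': {
                    'Jar': 'command-runner.jar',
                    'Args': ['spark-submit', 'job.py', '--date', '{{ ds }}'],
                },
            }],
            dag=dag,
        )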
   
   [AIRFLOW-1762] Implement key_file support in ssh_hook create_tunnel
   
   Switched to using sshtunnel package instead of
   popen approach
   
   Closes #3473 from NielsZeilemaker/ssh_hook
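
    A hedged sketch of tunnel creation with a key file (the create_tunnel
    signature is inferred from the commit and may differ; hosts and paths
    are illustrative):

        from airflow.contrib.hooks.ssh_hook import SSHHook

        hook = SSHHook(
            remote_host='bastion.example.com',
            username='airflow',
            key_file='/home/airflow/.ssh/id_rsa',  # now used by the tunnel
        )

        # Forward local port 5432 through the bastion (sshtunnel-based).
        with hook.create_tunnel(5432, remote_port=5432):
            pass  # talk to localhost:5432 inside this block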
   
   [AIRFLOW-XXX] Fix Docstrings for Operators (#3820)
   
    [AIRFLOW-2993] Renamed operators to meet name length requirements (#3828)

    [AIRFLOW-2993] Corrected flake8 line diff format (#3828)
   
   [AIRFLOW-2994] Fix flatten_results for BigQueryOperator (#3829)
   
   [AIRFLOW-2951] Update dag_run table end_date when state change (#3798)
   
    The existing Airflow only changed the dag_run table's end_date value
    when a user terminated a dag in the web UI. The end_date was not updated
    when Airflow itself detected that a dag had finished and updated its state.

    This commit adds an end_date update in DagRun's set_state function to
    fix the problem mentioned above.
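
    A minimal sketch of the change (not the exact Airflow source):

        from airflow.utils import timezone
        from airflow.utils.state import State

        def set_state(self, state):
            if self._state != state:
                self._state = state
                # New: record the end date whenever the run reaches a
                # terminal state, not only on manual termination in the UI.
                self.end_date = (timezone.utcnow()
                                 if self._state in State.finished() else None)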
   
   [AIRFLOW-2145] fix deadlock on clearing running TI (#3657)
   
    A `shutdown` task is not considered to be `unfinished`, so a dag run can
    deadlock when all `unfinished` downstream tasks are waiting on a task
    that is in the `shutdown` state. Fix this by considering `shutdown` to
    be `unfinished`, since it is not truly a terminal state.
   
   [AIRFLOW-2981] Fix TypeError in dataflow operators (#3831)
   
   - Fix TypeError in dataflow operators when using GCS jar or py_file
   
   [AIRFLOW-XXX] Fix typo in docstring of gcs_to_bq (#3833)
   
   [AIRFLOW-2476] Allow tabulate up to 0.8.2 (#3835)
   
   [AIRFLOW-XXX] Fix typos in faq.rst (#3837)
   
   [AIRFLOW-2979] Make celery_result_backend conf Backwards compatible (#3832)
   
    PR #2806 renamed `celery_result_backend` to `result_backend`, which broke backwards compatibility.
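
    A sketch of a backwards-compatible lookup (an assumed helper, not
    necessarily the merged code): prefer the new key and fall back to the
    deprecated one.

        from airflow import configuration as conf

        def get_result_backend():
            if conf.has_option('celery', 'result_backend'):
                return conf.get('celery', 'result_backend')
            # Deprecated pre-#2806 name, kept for backwards compatibility.
            return conf.get('celery', 'celery_result_backend')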
   
   [AIRFLOW-2866] Fix missing CSRF token head when using RBAC UI (#3804)
   
   [AIRFLOW-491] Add feature to pass extra api configs to BQ Hook (#3733)
   
   [AIRFLOW-208] Add badge to show supported Python versions (#3839)
   
   [AIRFLOW-2993] Added sftp_to_s3 operator and s3_to_sftp operator. (#3828)
   
   Make sure you have checked _all_ steps below.
   
   ### Jira
   
   - [ ] My PR addresses the following [Airflow Jira](https://issues.apache.org/jira/browse/AIRFLOW/) issues and references them in the PR title. For example, "\[AIRFLOW-XXX\] My Airflow PR"
     - https://issues.apache.org/jira/browse/AIRFLOW-XXX
     - In case you are fixing a typo in the documentation you can prepend your commit with \[AIRFLOW-XXX\], code changes always need a Jira issue.
   
   ### Description
   
   - [ ] Here are some details about my PR, including screenshots of any UI changes:
   
   ### Tests
   
   - [ ] My PR adds the following unit tests __OR__ does not need testing for this extremely good reason:
   
   ### Commits
   
   - [ ] My commits all reference Jira issues in their subject lines, and I have squashed multiple commits if they address the same issue. In addition, my commits follow the guidelines from "[How to write a good git commit message](http://chris.beams.io/posts/git-commit/)":
     1. Subject is separated from body by a blank line
     1. Subject is limited to 50 characters (not including Jira issue reference)
     1. Subject does not end with a period
     1. Subject uses the imperative mood ("add", not "adding")
     1. Body wraps at 72 characters
     1. Body explains "what" and "why", not "how"
   
   ### Documentation
   
   - [ ] In case of new functionality, my PR adds documentation that describes how to use it.
     - When adding new operators/hooks/sensors, the autoclass documentation generation needs to be added.
   
   ### Code Quality
   
   - [ ] Passes `git diff upstream/master -u -- "*.py" | flake8 --diff`
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services