Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2020/08/17 07:32:36 UTC
[GitHub] [airflow] turbaszek commented on a change in pull request #10354: Discontinue using deprecated methods in GCSToBigQueryOperator
turbaszek commented on a change in pull request #10354:
URL: https://github.com/apache/airflow/pull/10354#discussion_r471291646
##########
File path: airflow/providers/google/cloud/transfers/gcs_to_bigquery.py
##########
@@ -258,7 +258,7 @@ def execute(self, context):
cursor = conn.cursor()
if self.external_table:
- cursor.create_external_table(
+ bq_hook.create_external_table(
Review comment:
This method is also deprecated
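A hedged sketch of the non-deprecated path the reviewer is pointing at: in the Google provider of that era, `BigQueryHook.create_external_table` was itself deprecated in favour of `create_empty_table` called with a `table_resource` dict. The helper name `make_external_table` and the exact `create_empty_table` signature are assumptions for illustration, not the PR's actual code.

```python
def make_external_table(bq_hook, project_id, dataset_id, table_id,
                        source_uris, source_format="CSV"):
    # Assumed replacement: build a BigQuery table resource with an
    # externalDataConfiguration and hand it to create_empty_table,
    # instead of calling the deprecated create_external_table.
    table_resource = {
        "tableReference": {
            "projectId": project_id,
            "datasetId": dataset_id,
            "tableId": table_id,
        },
        "externalDataConfiguration": {
            "sourceUris": source_uris,
            "sourceFormat": source_format,
        },
    }
    return bq_hook.create_empty_table(
        project_id=project_id,
        dataset_id=dataset_id,
        table_resource=table_resource,
    )
```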
##########
File path: airflow/providers/google/cloud/transfers/gcs_to_bigquery.py
##########
@@ -276,7 +276,7 @@ def execute(self, context):
encryption_configuration=self.encryption_configuration
)
else:
- cursor.run_load(
+ bq_hook.insert_job(configuration=dict(
Review comment:
That won't do:
```
airflow.exceptions.AirflowException: Unknown job type. Supported types: dict_keys(['load', 'copy', 'extract', 'query'])
```
as per the `insert_job` docstring:
```
The configuration parameter maps directly to
BigQuery's configuration field in the job object. See
https://cloud.google.com/bigquery/docs/reference/v2/jobs for
details.
```
So what I would suggest is:
- add a `configuration` parameter to the constructor
- if this parameter is passed, use the `insert_job` method
- if not, show a deprecation warning that `Users should pass a load job definition` and use the existing logic
We should probably also validate that only a `load` job is passed... WDYT @edejong ?
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
users@infra.apache.org