Posted to commits@airflow.apache.org by "jack (JIRA)" <ji...@apache.org> on 2018/12/16 14:51:00 UTC
[jira] [Commented] (AIRFLOW-3254) BigQueryGetDataOperator to support reading query from SQL file
[ https://issues.apache.org/jira/browse/AIRFLOW-3254?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16722503#comment-16722503 ]
jack commented on AIRFLOW-3254:
-------------------------------
[~kaxilnaik] any chance you are working on this for 1.10.2?
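To illustrate what the request amounts to, here is a rough sketch of accepting either an inline query or a path to a .sql file. The function name and signature are hypothetical, not the actual BigQueryGetDataOperator API; in Airflow this is normally achieved by adding the parameter to an operator's template_fields so Jinja resolves .sql files via the DAG's template_searchpath.

```python
from pathlib import Path

def resolve_sql(sql, searchpath="."):
    """Hypothetical helper: return the query text directly, or read it
    from a file under `searchpath` when `sql` ends in '.sql'.
    (Sketch only; Airflow itself does this through template_fields
    and Jinja templating, not a helper like this.)"""
    if sql.endswith(".sql"):
        return Path(searchpath, sql).read_text()
    return sql  # already an inline query string
```

So resolve_sql('SELECT ID FROM TABLE') would pass the string through unchanged, while resolve_sql('importop.sql', '/path/to/sql') would read the query from the file.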
> BigQueryGetDataOperator to support reading query from SQL file
> --------------------------------------------------------------
>
> Key: AIRFLOW-3254
> URL: https://issues.apache.org/jira/browse/AIRFLOW-3254
> Project: Apache Airflow
> Issue Type: Improvement
> Affects Versions: 1.10.0
> Reporter: jack
> Assignee: Kaxil Naik
> Priority: Minor
>
> As discussed with [~Fokko] on Slack:
> Currently the BigQueryGetDataOperator only supports a query provided directly as a string:
>
> {code:python}
> sql = 'SELECT ID FROM TABLE'
> {code}
>
> It does not support reading the query from a SQL file, which can be inconvenient since queries are often quite large.
> This behavior is supported by other operators, for example MySqlToGoogleCloudStorageOperator:
>
> {code:python}
> dag = DAG(
>     dag_id='Import',
>     default_args=args,
>     schedule_interval='*/5 * * * *',
>     max_active_runs=1,
>     catchup=False,
>     template_searchpath=['/home/.../airflow/…/sql/Import']
> )
>
> importop = MySqlToGoogleCloudStorageOperator(
>     task_id='import',
>     mysql_conn_id='MySQL_con',
>     google_cloud_storage_conn_id='gcp_con',
>     provide_context=True,
>     sql='importop.sql',
>     params={'table_name': TABLE_NAME},
>     bucket=GCS_BUCKET_ID,
>     filename=file_name_orders,
>     dag=dag)
> {code}
>
> If anyone can pick it up, that would be great :)
>
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)