Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2020/10/12 13:59:55 UTC

[GitHub] [airflow] toshitanian commented on issue #11469: BaseSQLToGoogleCloudStorageOperator does not upload complete dump file to GCS

toshitanian commented on issue #11469:
URL: https://github.com/apache/airflow/issues/11469#issuecomment-707138442


   It seems that I had a misunderstanding about the parameters.
   
   When the dump is split into multiple files by `approx_max_file_size_bytes`, I have to include `{}` in the `filename` parameter.
   
   ```
   filename="some_table.json.{}"
   ```
   Then the operator uploads the temporary files as `some_table.json.01`, `some_table.json.02`, etc.
   I have to handle the resulting objects (e.g. when loading into BigQuery) with a wildcard or equivalent.
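   A minimal sketch of the naming behavior described above, assuming the operator substitutes a file counter into the template via `str.format` (the counter's start value and padding are assumptions; check the operator's source for the exact scheme):
   
   ```python
   # Sketch: how a "{}" placeholder in the filename template yields one
   # GCS object name per split file. chunk_names is a hypothetical helper,
   # not part of the Airflow API.
   filename_template = "some_table.json.{}"
   
   def chunk_names(template, num_chunks):
       """Return the object names produced for num_chunks split files."""
       return [template.format(i) for i in range(num_chunks)]
   
   print(chunk_names(filename_template, 3))
   # e.g. ['some_table.json.0', 'some_table.json.1', 'some_table.json.2']
   ```
   
   With names like these, a downstream load can match all parts with a wildcard URI such as `gs://<bucket>/some_table.json.*`.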
   
   I will try that and close this issue if it works.
   
   
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org