Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2020/06/04 14:11:51 UTC

[GitHub] [airflow] nullhack commented on a change in pull request #9140: Fix gzip option of sql_to_gcs operator

nullhack commented on a change in pull request #9140:
URL: https://github.com/apache/airflow/pull/9140#discussion_r435289304



##########
File path: airflow/providers/google/cloud/operators/sql_to_gcs.py
##########
@@ -292,4 +292,4 @@ def _upload_to_gcs(self, files_to_upload):
             hook.upload(self.bucket, tmp_file.get('file_name'),
                         tmp_file.get('file_handle').name,
                         mime_type=tmp_file.get('file_mime_type'),
-                        gzip=self.gzip if tmp_file.get('file_name') == self.schema_filename else False)
+                        gzip=self.gzip if tmp_file.get('file_name') != self.schema_filename else False)

Review comment:
       I don't see an issue there: you can define the schema and still set `gzip=True`. It performs exactly what is documented, `compress file for upload (does not apply to schemas)`; the fact that the schema is passed manually does not interfere. The schema file is created only if a schema filename is provided:
   
   ```python
           # If a schema is set, create a BQ schema JSON file.
           if self.schema_filename:
               self.log.info("Writing local schema file")
               files_to_upload.append(self._write_local_schema_file(cursor))
   ```
   
   If `schema_filename` is set, then the file is created and the documentation still applies.
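    
    To make the corrected conditional concrete, here is a minimal sketch (with a hypothetical helper name, `should_gzip`, not part of the operator) of the gzip selection after this fix: data files honor the operator's `gzip` option, while the schema file is never compressed, matching the documented behavior:

```python
def should_gzip(file_name, schema_filename, gzip_option):
    """Return the gzip flag to pass to hook.upload for a given file.

    After the fix, compression applies to every uploaded file EXCEPT
    the schema file (the `!=` comparison in the patched line).
    """
    return gzip_option if file_name != schema_filename else False


# A data export file follows the operator's gzip option.
print(should_gzip("export_0.json", "schema.json", True))   # True

# The schema file is never compressed, per the docstring.
print(should_gzip("schema.json", "schema.json", True))     # False
```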




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org