Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2020/12/29 10:00:24 UTC

[GitHub] [airflow] tuanchris opened a new pull request #13359: Add Parquet data type to BaseSQLToGCSOperator

tuanchris opened a new pull request #13359:
URL: https://github.com/apache/airflow/pull/13359


   As the title suggests, this adds a Parquet data type to BaseSQLToGCSOperator, enabling SQL -> Parquet file exports on GCS.
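For illustration, a usage sketch of the new option (a hypothetical DAG, not code from this PR; it assumes the Google provider's MySQLToGCSOperator subclass of BaseSQLToGCSOperator and preconfigured `mysql_default` / `google_cloud_default` connections):

```python
# Hypothetical sketch -- assumes Airflow with the Google provider installed
# and working MySQL / GCP connections; bucket and table names are made up.
from airflow import DAG
from airflow.providers.google.cloud.transfers.mysql_to_gcs import MySQLToGCSOperator
from airflow.utils.dates import days_ago

with DAG(
    dag_id="sql_to_gcs_parquet_example",
    start_date=days_ago(1),
    schedule_interval=None,
) as dag:
    export_orders = MySQLToGCSOperator(
        task_id="export_orders_to_parquet",
        sql="SELECT * FROM orders",
        bucket="my-example-bucket",
        filename="exports/orders_{}.parquet",
        export_format="parquet",  # the new format added by this PR
    )
```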
   
   
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [airflow] tuanchris commented on pull request #13359: Add Parquet data type to BaseSQLToGCSOperator

Posted by GitBox <gi...@apache.org>.
tuanchris commented on pull request #13359:
URL: https://github.com/apache/airflow/pull/13359#issuecomment-752848528


   @potiuk Hi Jarek,
   
   Thanks for the reviews. 
   
   I have implemented tests separately for the three data types. The tests make sure that the actual Parquet, CSV, and JSON files are uploaded by the class. Let me know what you think.
   
   Happy new year!!
   
   Tuan





[GitHub] [airflow] potiuk merged pull request #13359: Add Parquet data type to BaseSQLToGCSOperator

Posted by GitBox <gi...@apache.org>.
potiuk merged pull request #13359:
URL: https://github.com/apache/airflow/pull/13359


   





[GitHub] [airflow] github-actions[bot] commented on pull request #13359: Add Parquet data type to BaseSQLToGCSOperator

Posted by GitBox <gi...@apache.org>.
github-actions[bot] commented on pull request #13359:
URL: https://github.com/apache/airflow/pull/13359#issuecomment-752918447


   The PR is likely OK to be merged with just a subset of tests for the default Python and Database versions, without running the full matrix of tests, because it does not modify the core of Airflow. If the committers decide that the full test matrix is needed, they will add the label 'full tests needed'. Then you should rebase to the latest master or amend the last commit of the PR and push it with --force-with-lease.





[GitHub] [airflow] potiuk commented on a change in pull request #13359: Add Parquet data type to BaseSQLToGCSOperator

Posted by GitBox <gi...@apache.org>.
potiuk commented on a change in pull request #13359:
URL: https://github.com/apache/airflow/pull/13359#discussion_r550275535



##########
File path: tests/providers/google/cloud/transfers/test_sql_to_gcs.py
##########
@@ -140,6 +141,7 @@ def test_exec(
 
         cursor_mock.__iter__ = Mock(return_value=iter(INPUT_DATA))
 
+        # Test JSON

Review comment:
       I think the test should be split into three tests now. One test should test one thing only.
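The split the reviewer suggests could be structured as below (a schematic sketch only, not the real test file: the `MIME_TYPES` stand-in simply mirrors the MIME types visible in the quoted diff, where the JSON branch uploads with APP_JSON and the Parquet branch with 'application/octet-stream'; in the real tests each method would run the operator with one `export_format` and assert on the mocked upload):

```python
import unittest

# Stand-in for the behavior under test: which MIME type the operator
# passes to upload() for each export_format (values as in the PR's diff;
# 'text/csv' for CSV is assumed here, not shown in the quoted hunk).
MIME_TYPES = {
    "json": "application/json",
    "csv": "text/csv",
    "parquet": "application/octet-stream",
}

class TestSQLToGCSExportFormats(unittest.TestCase):
    # One focused test per export format instead of one long test_exec.
    def test_exec_json(self):
        self.assertEqual(MIME_TYPES["json"], "application/json")

    def test_exec_csv(self):
        self.assertEqual(MIME_TYPES["csv"], "text/csv")

    def test_exec_parquet(self):
        self.assertEqual(MIME_TYPES["parquet"], "application/octet-stream")

# Run the three tests programmatically so the result can be inspected.
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(TestSQLToGCSExportFormats)
)
```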

##########
File path: tests/providers/google/cloud/transfers/test_sql_to_gcs.py
##########
@@ -160,6 +162,37 @@ def test_exec(
         mock_upload.assert_called_once_with(BUCKET, FILENAME, TMP_FILE_NAME, mime_type=APP_JSON, gzip=False)
         mock_close.assert_called_once()
 
+        mock_query.reset_mock()
+        mock_flush.reset_mock()
+        mock_upload.reset_mock()
+        mock_close.reset_mock()
+        cursor_mock.reset_mock()
+
+        cursor_mock.__iter__ = Mock(return_value=iter(INPUT_DATA))
+
+        # Test parquet
+        operator = DummySQLToGCSOperator(
+            sql=SQL, bucket=BUCKET, filename=FILENAME, task_id=TASK_ID, export_format="parquet", schema=SCHEMA
+        )
+        operator.execute(context=dict())
+
+        mock_query.assert_called_once()
+        mock_write.assert_has_calls(
+            [
+                mock.call(OUTPUT_DATA),
+                mock.call(b"\n"),
+                mock.call(OUTPUT_DATA),
+                mock.call(b"\n"),
+                mock.call(OUTPUT_DATA),
+                mock.call(b"\n"),
+            ]
+        )
+        mock_flush.assert_called_once()
+        mock_upload.assert_called_once_with(
+            BUCKET, FILENAME, TMP_FILE_NAME, mime_type='application/octet-stream', gzip=False
+        )
+        mock_close.assert_called_once()
+

Review comment:
       Is there any way we can verify that we are actually sending a Parquet file? Could we, for example, try to read it back and parse it as Parquet? I am not 100% sure, but it seems a bit suspicious that we see the same OUTPUT_DATA being written in both the CSV and Parquet cases. Maybe I do not understand it fully, but I'd expect different output in the two cases.
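One cheap way to assert "this really is Parquet" without depending on a full parser (an illustrative sketch, not code from the PR): the Parquet format frames every file with the 4-byte magic `PAR1` at both the start and the end, so a test could at least check that framing; a stronger test could read the temp file back with pyarrow and compare rows.

```python
import os
import tempfile

PARQUET_MAGIC = b"PAR1"  # Parquet files begin and end with this marker

def looks_like_parquet(path: str) -> bool:
    """Cheap structural check: real Parquet files start and end with PAR1."""
    size = os.path.getsize(path)
    if size < 8:  # needs at least the header magic plus the footer magic
        return False
    with open(path, "rb") as f:
        head = f.read(4)
        f.seek(-4, os.SEEK_END)
        tail = f.read(4)
    return head == PARQUET_MAGIC and tail == PARQUET_MAGIC

# Demo with synthetic files (not real Parquet data, just the framing bytes):
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(PARQUET_MAGIC + b"\x00" * 16 + PARQUET_MAGIC)
    fake_parquet = f.name

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"col1,col2\n1,2\n")  # CSV payload fails the check
    fake_csv = f.name
```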

##########
File path: airflow/providers/google/cloud/transfers/sql_to_gcs.py
##########
@@ -198,6 +202,9 @@ def _write_local_data_files(self, cursor):
 
         if self.export_format == 'csv':
             csv_writer = self._configure_csv_file(tmp_file_handle, schema)
+        if self.export_format == 'parquet':
+            parquet_schema = self._convert_parquet_schema(cursor)
+            # parquet_writer = self._configure_parquet_file(tmp_file_handle, parquet_schema)

Review comment:
       Is this commented-out line deliberate?



