Posted to commits@airflow.apache.org by GitBox <gi...@apache.org> on 2022/12/06 11:56:54 UTC

[GitHub] [airflow] Taragolis commented on a diff in pull request #28145: Migrate amazon provider transfer tests from `unittests` to `pytest`

Taragolis commented on code in PR #28145:
URL: https://github.com/apache/airflow/pull/28145#discussion_r1040867411


##########
tests/providers/amazon/aws/transfers/test_sql_to_s3.py:
##########
@@ -146,11 +145,12 @@ def test_execute_json(self, mock_s3_hook, temp_mock):
                 replace=True,
             )
 
-    @parameterized.expand(
+    @pytest.mark.parametrize(
+        "_, params",
         [
             ("with-csv", {"file_format": "csv", "null_string_result": None}),
             ("with-parquet", {"file_format": "parquet", "null_string_result": "None"}),
-        ]
+        ],
     )
     def test_fix_dtypes(self, _, params):

Review Comment:
   With `pytest.mark.parametrize` you do not need the unnamed variable `_` to label the actual test case.
   You can use `pytest.param(..., id="some value")` instead. This id can also be used to select and run an individual test.
   
   ```python
       @pytest.mark.parametrize(
           "params",
           [
               pytest.param({"file_format": "csv", "null_string_result": None}, id="with-csv"),
               pytest.param({"file_format": "parquet", "null_string_result": "None"}, id="with-parquet"),
           ],
       )
       def test_fix_dtypes(self, params):
   ```
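   For reference, each `id` becomes part of the pytest node id (for example `test_fix_dtypes[with-csv]`), so a single case can be selected from the command line with `-k "with-csv"` or by passing the full node id to `pytest`.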



##########
tests/providers/amazon/aws/transfers/test_sftp_to_s3.py:
##########
@@ -95,9 +85,10 @@ def test_sftp_to_s3_operation(self, use_temp_file=True):
         create_file_task.execute(None)
 
         # Test for creation of s3 bucket
+        s3_hook = S3Hook("aws_default")

Review Comment:
   We do not need any connection here since S3 is mocked anyway, and it is better to pass the argument as a keyword:
   ```suggestion
           s3_hook = S3Hook(aws_conn_id=None)
   ```
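   For context, a minimal standalone sketch (not part of this PR, and assuming a moto release that still exposes `mock_s3`; newer releases use `mock_aws` instead) of why no connection is needed: with moto patching the S3 API, the hook never resolves an Airflow connection, so `aws_conn_id=None` skips the lookup entirely while moto supplies fake credentials for boto3.
   ```python
   from moto import mock_s3  # assumption: older moto; newer releases expose mock_aws

   from airflow.providers.amazon.aws.hooks.s3 import S3Hook


   @mock_s3
   def test_s3_hook_without_airflow_connection():
       # No Airflow connection is looked up; moto fakes the credentials and the S3 backend.
       s3_hook = S3Hook(aws_conn_id=None)
       s3_hook.create_bucket(bucket_name="test-bucket")
       assert s3_hook.check_for_bucket("test-bucket")
   ```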



##########
tests/providers/amazon/aws/transfers/test_s3_to_sftp.py:
##########
@@ -84,7 +79,8 @@ def test_s3_to_sftp_operation(self):
 
         with open(LOCAL_FILE_PATH, "w") as file:
             file.write(test_remote_file_content)
-        self.s3_hook.load_file(LOCAL_FILE_PATH, self.s3_key, bucket_name=BUCKET)
+        s3_hook = S3Hook("aws_default")

Review Comment:
   ```suggestion
           s3_hook = S3Hook(aws_conn_id=None)
   ```



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@airflow.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org