Posted to commits@airflow.apache.org by "ferruzzi (via GitHub)" <gi...@apache.org> on 2023/03/09 17:46:09 UTC

[GitHub] [airflow] ferruzzi commented on a diff in pull request #29452: Add support of a different AWS connection for DynamoDB

ferruzzi commented on code in PR #29452:
URL: https://github.com/apache/airflow/pull/29452#discussion_r1131381484


##########
airflow/providers/amazon/aws/transfers/dynamodb_to_s3.py:
##########
@@ -78,16 +89,20 @@ class DynamoDBToS3Operator(BaseOperator):
         :ref:`howto/transfer:DynamoDBToS3Operator`
 
     :param dynamodb_table_name: Dynamodb table to replicate data from
+    :param source_aws_conn_id: The Airflow connection used for AWS credentials
+        to access DynamoDB. If this is None or empty then the default boto3
+        behaviour is used. If running Airflow in a distributed manner and
+        source_aws_conn_id is None or empty, then default boto3 configuration
+        would be used (and must be maintained on each worker node).
     :param s3_bucket_name: S3 bucket to replicate data to
     :param file_size: Flush file to s3 if file size >= file_size
     :param dynamodb_scan_kwargs: kwargs passed to <https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/dynamodb.html#DynamoDB.Table.scan>
     :param s3_key_prefix: Prefix of s3 object key
     :param process_func: How we transform a dynamodb item to bytes. By default we dump the json
-    :param aws_conn_id: The Airflow connection used for AWS credentials.
-        If this is None or empty then the default boto3 behaviour is used. If
-        running Airflow in a distributed manner and aws_conn_id is None or
-        empty, then default boto3 configuration would be used (and must be
-        maintained on each worker node).
+    :param dest_aws_conn_id: The Airflow connection used for AWS credentials
+        to access S3. If this is not set then the source_aws_conn_id connection is used.
+    :param aws_conn_id: The Airflow connection used for AWS credentials (deprecated).

Review Comment:
   ```suggestion
       :param aws_conn_id: The Airflow connection used for AWS credentials (deprecated; use source_aws_conn_id).
   ```
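
For context, a minimal sketch of how the split connections in this diff might be used in a DAG. The parameter names (source_aws_conn_id, dest_aws_conn_id) follow the diff above and are not yet in a released provider; the connection ids, table, and bucket names are illustrative:

```python
# Illustrative sketch only: assumes source_aws_conn_id / dest_aws_conn_id
# land as written in the diff above. Connection ids here are made up.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.transfers.dynamodb_to_s3 import DynamoDBToS3Operator

with DAG(
    dag_id="dynamodb_to_s3_split_conns",
    start_date=datetime(2023, 3, 1),
    schedule=None,
) as dag:
    DynamoDBToS3Operator(
        task_id="backup_table",
        dynamodb_table_name="my_table",     # table read via the source connection
        s3_bucket_name="my-backup-bucket",  # bucket written via the dest connection
        source_aws_conn_id="aws_dynamodb",  # credentials for the DynamoDB scan
        dest_aws_conn_id="aws_s3",          # credentials for the S3 upload; per the
                                            # docstring, falls back to
                                            # source_aws_conn_id if unset
    )
```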



##########
airflow/providers/amazon/aws/transfers/dynamodb_to_s3.py:
##########
@@ -78,16 +89,20 @@ class DynamoDBToS3Operator(BaseOperator):
         :ref:`howto/transfer:DynamoDBToS3Operator`
 
     :param dynamodb_table_name: Dynamodb table to replicate data from
+    :param source_aws_conn_id: The Airflow connection used for AWS credentials
+        to access DynamoDB. If this is None or empty then the default boto3
+        behaviour is used. If running Airflow in a distributed manner and
+        source_aws_conn_id is None or empty, then default boto3 configuration
+        would be used (and must be maintained on each worker node).
     :param s3_bucket_name: S3 bucket to replicate data to
     :param file_size: Flush file to s3 if file size >= file_size
     :param dynamodb_scan_kwargs: kwargs passed to <https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/dynamodb.html#DynamoDB.Table.scan>
     :param s3_key_prefix: Prefix of s3 object key
     :param process_func: How we transform a dynamodb item to bytes. By default we dump the json
-    :param aws_conn_id: The Airflow connection used for AWS credentials.
-        If this is None or empty then the default boto3 behaviour is used. If
-        running Airflow in a distributed manner and aws_conn_id is None or
-        empty, then default boto3 configuration would be used (and must be
-        maintained on each worker node).
+    :param dest_aws_conn_id: The Airflow connection used for AWS credentials
+        to access S3. If this is not set then the source_aws_conn_id connection is used.
+    :param aws_conn_id: The Airflow connection used for AWS credentials (deprecated).
+
     """  # noqa: E501
 
     template_fields: Sequence[str] = (

Review Comment:
   We may as well add them to the template fields while we're at it.
   
   ```suggestion
       template_fields: Sequence[str] = (
           "source_aws_conn_id",
           "dest_aws_conn_id",
   ```
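
If the connection ids do become template fields as suggested, they could be rendered at runtime like any other templated argument. A hypothetical sketch, assuming the suggestion above is applied (the Variable names are made up):

```python
# Hypothetical: only works if source_aws_conn_id / dest_aws_conn_id are added
# to template_fields as suggested above. Variable names are illustrative.
DynamoDBToS3Operator(
    task_id="backup_table_templated",
    dynamodb_table_name="my_table",
    s3_bucket_name="my-backup-bucket",
    source_aws_conn_id="{{ var.value.dynamo_conn_id }}",  # rendered per task run
    dest_aws_conn_id="{{ var.value.s3_conn_id }}",
)
```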


