Posted to commits@airflow.apache.org by "eladkal (via GitHub)" <gi...@apache.org> on 2023/03/09 20:31:48 UTC

[GitHub] [airflow] eladkal commented on pull request #29452: Add support of a different AWS connection for DynamoDB

eladkal commented on PR #29452:
URL: https://github.com/apache/airflow/pull/29452#issuecomment-1462741248

   I'm OK with it
   
   
   I still think this problem is not specific to DynamoDB.
   I think we should further explore making it generic for the other AWS transfer operators.
   Something like:
   
    ```
    from __future__ import annotations

    from airflow.models import BaseOperator
    from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook
    from airflow.utils.types import NOTSET, ArgNotSet


    class BaseAwsTransferOperator(BaseOperator):
        """Shared base for AWS-to-AWS transfer operators that may use two connections."""

        def __init__(
            self,
            *,
            source_aws_conn_id: str | None = AwsBaseHook.default_conn_name,
            dest_aws_conn_id: str | None | ArgNotSet = NOTSET,
            aws_conn_id: str | None | ArgNotSet = NOTSET,
            **kwargs,
        ) -> None:
            super().__init__(**kwargs)
            # Keep both connection ids; each concrete transfer operator decides
            # how the destination falls back when only aws_conn_id is given.
            self.source_aws_conn_id = source_aws_conn_id
            self.dest_aws_conn_id = dest_aws_conn_id
            self.aws_conn_id = aws_conn_id


    class DynamoDBToS3Operator(BaseAwsTransferOperator):
        ...


    class S3ToS3Operator(BaseAwsTransferOperator):
        ...
    ```
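
   Purely as a sketch (the property name below is made up, not something in this PR), the base class could then resolve which connection the destination side should use:

    ```
    # Hypothetical property on BaseAwsTransferOperator, only to illustrate the
    # fallback order: explicit dest connection, then aws_conn_id, then the
    # source connection.
    @property
    def resolved_dest_aws_conn_id(self) -> str | None:
        if not isinstance(self.dest_aws_conn_id, ArgNotSet):
            return self.dest_aws_conn_id
        if not isinstance(self.aws_conn_id, ArgNotSet):
            return self.aws_conn_id
        return self.source_aws_conn_id
    ```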

